Open Access

Staying connected with ICT tools: tracking youth respondents in a Chinese context

The Journal of Chinese Sociology 2018, 5:1

Received: 15 June 2017

Accepted: 30 January 2018

Published: 6 March 2018


Young people have been perceived as a group that benefits most directly from China’s education aspirations and modernization campaign. Given their high social and spatial mobility related to education and career pursuits, it is intriguing but also difficult to follow them in panel studies. This methodology paper summarizes how the Panel Study of Nanjing High School Graduates explores different ways to track youth respondents with the aid of information and communication technology (ICT) tools. Rather than examining the tracking outcomes and attrition rates from a quantitative perspective, this paper discusses tracking strategies from a qualitative perspective based on a continuously updated understanding of youth development. The findings suggest that the use of ICT tools does not necessarily lead to the cooperation of youth respondents, and researchers have to make deliberate choices about how to use such tools to embrace diversity, enhance trust, and show respect for privacy. In this era of information overload and ever-changing youth culture, this study explores the advantages and limits of ICT tools, arguing that researchers need local and contextualized knowledge to select and combine different ICT tools in order to deal with the distrust and noncooperation of respondents on virtual platforms, rather than merely relying on convenient ICT tools and material incentives.


Keywords: Panel study, ICT, Youth, China


China’s economic development has led to increasing social and spatial mobility, which poses unprecedented challenges for panel studies. Even if respondents are reachable, they may not be “available” to complete the surveys, that is, willing to invest the time and energy to participate in the study. Admittedly, information and communication technology (ICT) tools have made it easier to access and update the contact information of respondents in a timely manner at low cost. ICT tools refer to the electronic devices and applications that “facilitate the transfer of information and various types of electronically mediated communications” (Zuppo 2012), including telephones, email, computers and so on. However, people are more selective in reading emails, picking up phone calls, and accepting online-chat requests. Due to the lack of trust, online and virtual communication does not always elicit smooth conversations but can lead to feelings of being interrupted and harassed. This paper revisits the long-standing issue of the reachability and availability of respondents, contextualized in this new era of social mobility and information technology.

This paper reports on the use of ICT tools for tracking young respondents in a panel study that faced two major challenges. First, the difficulties in tracking youth respondents were highly conditioned by the life course. Respondents were high school students at the time of the first wave and were followed after they graduated. They experienced rapid changes in their lives and often gained a sense of autonomy and self-determination. Second, the aid of ICT tools is still embedded in the Chinese social understanding of “closeness” and “distance.” Although China has witnessed an individualization process that unties individuals from traditional family and kinship institutions (Yan 2011), the importance of social connections still remains, not only for achieving instrumental goals but also for expressing affect and morality (Osburg 2013). Without such connections, respondents can easily turn down requests from those they neither know nor trust. These are the challenges that longitudinal designs have to take into consideration.

Panel studies and tracking strategies

Panel studies have been widely used for studying time-varying factors and relationships. However, the loss of cases in follow-up surveys is a common issue, often due to the death, disability, dispersion, and migration of the respondents (Ribisl et al. 1996). Conventional methods of home visits and telephone interviews have become less effective in dealing with such uncertainties (Feng 2006), and the attrition rates of longitudinal studies can range between 40 and 70% (Ribisl et al. 1996). In some cases, respondents who “disappear” tend to have antisocial or deviant tendencies (Cotter et al. 2005), but respondents are also likely to quit due to difficult questions (Navratil et al. 1994), long interviews, low remuneration (Stouthamer-Loeber et al. 1992), and an uncomfortable atmosphere (Cotter et al. 2002). This reluctance can be a temporary problem, unique to each individual, or solved on a case-by-case basis (Navratil et al. 1994). Recent research suggests the need to contextualize tracking strategies in social settings (Wang et al. 2014), which is crucial for retaining respondents.

Economic incentives or financial rewards are often used in panel studies to motivate follow-up participation (Ribisl et al. 1996). However, economic incentives may not be sufficiently effective in an urban context and among better-off people (Sun et al. 2011; Sun 2012). From another perspective, economic incentives may not be the only factor; noneconomic incentives may also encourage people to contribute to the survey, sometimes in combination with economic incentives. Furthermore, panel studies have to design proper incentives and the forms of “giving” carefully, such as using music vouchers and prize draws (Boys et al. 2003). Some studies send gift vouchers together with a thank-you letter and a change-of-address card to encourage respondents to keep their address updated between interview points (Laurie and Scott 1999). For respondents who do not have mailing addresses, such as homeless people, researchers have tried buying them food and drinks and have relied on homeless shelters during the tracking process (Ribisl et al. 1996). Researchers continue to emphasize good rapport and regular correspondence (Navratil et al. 1994), but it can be too costly to follow and retain “difficult” participants, and researchers need to weigh the costs against the loss caused by attrition (Ribisl et al. 1996).

Given the cost concern and technological development, many panel studies have adopted survey methods using cell phones and the Internet (see special issues of Public Opinion Quarterly 71, no. 5 [2007]; 72, no. 5 [2008]). ICT tools have gained greater importance since face-to-face interviews have become more expensive and many families now use only cell phones instead of land lines (Couper and Miller 2008). However, the use of ICT tools may be constrained by their own nature: for example, the Internet user population can differ from that of telephone users (ibid.), introducing new risks of bias. The issues of nonobservation and representation cannot be easily solved by statistical procedures such as endpoint analyses, time-controlled analyses, and regression techniques to address the loss of missing subjects (Flick 1988). In sum, the effectiveness and appropriateness of both economic incentives and ICT tools in panel studies need to be evaluated.

Many panel studies in China continue to rely on conventional survey methods such as household visits and face-to-face interviews, and some have examined the attrition patterns. In the China Family Panel Studies (CFPS), the response rate was positively related to education and negatively related to urban residence and income levels (Sun et al. 2011; Sun 2012). This makes urban and educated youth a complicated group to study because they tend to have more knowledge about social research but are also influenced by the “indifferent” urban culture (Simmel 2002). It is similarly difficult to track urban residents and young people in the China Health and Nutrition Survey (CHNS), and the urban-rural gap is less significant among young people (Liang 2011). This may be because both urban and rural youth are highly mobile groups, and those who stay in boarding schools or have moved to other places cannot be captured by a family-based sampling structure (Liang 2011). These studies point out various attrition problems for different socioeconomic and age groups, but they have not touched on the concrete strategies for tracking particularly mobile groups.

Regarding recent panel studies on youth in China, there has been an increasing mix of survey methods, but limited attention has been paid to how respondents are tracked and retained. The China University Students Survey conducted by Tsinghua University relied on self-administered questionnaires in the first wave in 2010 (China Data Center 2012). The survey used stratified random sampling to obtain a sample of 19 universities and then further drew a random sample of students in each university who graduated in the year 2010. The Employment, Life, and Values of University Students Survey conducted by the Chinese Academy of Social Sciences used a Website survey in the first wave in 2013 (Li 2013; Li and Shi 2013). The survey used a multistage stratified random sampling strategy to sample from 12 universities. The released findings focus on the transition of youth from school to work based on the first wave, but little is known about how follow-up waves have been conducted to track university students after graduation. The Beijing College Students Panel Survey (Wu 2017) began with freshmen and junior college students and used face-to-face interviews in the first waves in 2009 and 2010. The survey adopted online questionnaires in the following waves, and it remains to be examined how tracking outcomes have been affected over time, given the diversity and uncertainty of respondents’ growth trajectories.

Our study also focuses on youth in urban China but with an earlier starting point—when respondents were still in high school. The research purpose is to investigate their transition to higher education and adulthood (university destinations, employment outcomes, and family formation) and how such pathways are shaped by their family background, school experiences, and social capital. The study followed how young students moved after the college entrance exam (gaokao), an important point of diversification leading to different social and spatial mobility patterns. Upon transition to university life, many young people leave their place of origin designated by the household registration system (hukou). The consequent education migration inevitably leads to the geographic dispersion of youth. Since the Maoist era, university education has enabled numerous rural youth to move to cities and people from small towns to municipalities, with enhanced chances of finding jobs and settling in desirable locations. With a concentration of education institutions in cities and that of elite universities in municipalities, education migration is a ranked process in which prestigious universities attract youth from all over the country, and less-attractive institutions recruit more local students from within a shorter radius. This migration pattern was disrupted during the Cultural Revolution and then reinstated under the market reform, in parallel with increasing labor migration and persisting institutional barriers. Education migration remains an important channel of social mobility that can influence people’s subsequent migration trajectories and socioeconomic achievements. Meanwhile, some young people may enter labor markets earlier than those who continue to pursue further study. This work also aims to capture such diverse career and life-course pathways.
Given the high mobility of young respondents, the tracking methods have been continuously evaluated and contextualized in the youth development process and the information technology era. The research team adopted various tracking tools to solicit cooperation across time and space and examined the attrition results over waves.


The Panel Study of Nanjing High School Graduates examines the education transition of urban high school students originally located in Nanjing and explores the factors associated with the process of youth growth and development. The research team includes researchers at Nanjing University, Hong Kong Baptist University, and the Chinese University of Hong Kong. The study was supported by several sources, particularly the Research Grants Council of the Hong Kong Special Administrative Region. The survey design has been developed and modified based on field visits, virtual communication, and the continuous trial-and-error process of adopting various tracking methods. Different from Web surveys based on self-selected samples, this study started with a multistage sampling strategy and drew a stratified sample of 11 high schools representing different school ranks in Nanjing. In each high school, a cluster sampling strategy was used to randomly select two or three Year-2 classes, and each school contributed around 80–100 respondents to the sample. As such, the sample was designed to be representative with regard to high schools of various academic reputations and facilities, which differs from the online panels of opt-in volunteers (Couper and Miller 2008). Researchers’ knowledge about the sampling frame facilitated the continuous examination of the coverage and representativeness of the remaining sample while relying on a mix of ICT tools to follow respondents. In sum, this study is a multimode survey with conventional and ICT-based components, similar to a few other studies (Rao et al. 2010).

The first wave (May to June 2010) collected data using self-administered questionnaires, resulting in a sample of 1027 people with 987 responses (a response rate of 96.1%). The second wave was conducted from July to August 2011. Before that period, students were occupied with preparing for the national college entrance exam, so the research team waited until the exam was over to conduct the survey. This survey resulted in 891 responses (86.8%). This was the final chance to update the contact information of respondents before they graduated from high school; afterward, most would leave home and could not be tracked via home addresses and telephone numbers. In addition, researchers could not continue to rely on the high school teachers, who had helped with the entry into the field, to conduct follow-up surveys. It was thus necessary to collect and update all the possible contact information before some respondents “disappeared.” The research team managed to survey most respondents using the self-administered questionnaires; for those who could not be met in person, researchers collected responses via email questionnaires.

The subsequent surveys were conducted in 2012 (May to October), 2013 (June to October), and 2014 (November 2014 to January 2015), when most respondents were in their first, second, and third years at universities. The 2012 wave used a mix of email survey and Web survey methods, and the 2013 and 2014 waves relied mainly on the Web survey method. The research team sent emails to respondents and attached the questionnaire or the Web survey link respectively, and used phone calls, email, and QQ1 to track respondents if no response was received. The response rate was 76.9% (790 people), 72.9% (749 people), and 73.6% (756 people) respectively. The following waves will continue to be conducted in order to follow respondents after their graduation from universities.
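The per-wave response rates above are straightforward to recompute against the baseline sample of 1027 respondents. A minimal sketch (the wave labels and counts are taken directly from the figures reported in this and the preceding paragraph):

```python
# Recompute per-wave response rates from the baseline sample of 1027.
BASELINE = 1027

responses = {  # wave year -> number of returned questionnaires
    2010: 987,
    2011: 891,
    2012: 790,
    2013: 749,
    2014: 756,
}

# Response rate as a percentage of the baseline sample, rounded to one decimal.
rates = {wave: round(100 * n / BASELINE, 1) for wave, n in responses.items()}

for wave, rate in sorted(rates.items()):
    print(f"{wave}: {rate}%")
```

Running this reproduces the reported sequence (96.1%, 86.8%, 76.9%, 72.9%, 73.6%), confirming that all waves are benchmarked against the original first-wave sample rather than the previous wave.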

Over the different waves, the survey method evolved from paper questionnaires, to email questionnaires, to Web surveys. Email questionnaires were first used in the 2011 wave, which illustrated email’s strength in allowing immediate transmission, convenient interactions, and delayed responses (Sproull and Kiesler 1991). However, respondents may still find it tedious to answer an email questionnaire, particularly if it is long or has complicated skip patterns. The Web survey system greatly simplifies the process. Respondents only need to open the survey link and answer by clicking or choosing from several options. The survey website can filter respondents and route them through different skip patterns, which saves respondents time and energy in finishing the survey.
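As a rough illustration of the routing logic described above — not the actual survey system used in this study — a Web survey can evaluate each question’s applicability against the answers given so far and show only the questions that apply. The question ids and conditions below are hypothetical:

```python
# Hypothetical sketch of Web-survey skip logic: each question carries a
# condition ("show_if") evaluated against earlier answers, so respondents
# never see items that do not apply to them.
def next_questions(answers, questionnaire):
    """Return ids of unanswered questions that apply given answers so far."""
    return [
        q["id"]
        for q in questionnaire
        if q["id"] not in answers and q["show_if"](answers)
    ]

questionnaire = [
    {"id": "enrolled", "show_if": lambda a: True},
    {"id": "university_rank", "show_if": lambda a: a.get("enrolled") == "yes"},
    {"id": "job_search", "show_if": lambda a: a.get("enrolled") == "no"},
]

print(next_questions({}, questionnaire))                 # ['enrolled']
print(next_questions({"enrolled": "yes"}, questionnaire))  # ['university_rank']
```

In a paper or email questionnaire the respondent must read and apply such skip instructions manually; encoding them as conditions is what lets the Web survey do the filtering automatically.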

Although with the aid of ICT tools the survey can be distributed electronically with greater convenience and lower cost, the use of ICT tools does not guarantee that respondents are available and will cooperate in survey participation. The process of tracking respondents usually takes several months due to finding “missing” respondents, motivating more respondents to participate, and allowing respondents to find a suitable time to complete the questionnaire. Furthermore, the use of ICT tools introduces new uncertainties and challenges because it is difficult to predict when respondents will respond, how they will participate, or whether the survey request has been discarded as junk mail. The research team thus has to try a wide variety of ICT tracking methods in this process in order to determine the best way to communicate with respondents.

Results and discussion

Diversity and flexibility: tracking respondents with ICT tools

Panel studies have relied on the locating information from the original study to track respondents (Haggerty et al. 2008), but such old information is not always reliable. In the 2012 wave, one major challenge was to relocate respondents because after they entered their first year of university life, many respondents got a new cell phone, moved into a new dormitory, and began to use university email services. Given their diversified communication styles, the research team tried all possible contact information from previous waves that could be useful in tracking respondents, such as email, QQ, or phone. Researchers also got in touch with some key informants, including the high school teachers and classmates, who helped pass on messages regarding the survey and update the contact information of some respondents. As suggested by other studies, the “comprehensive location information” can be traced from the participant’s friends or relatives as well as any available records (Navratil et al. 1994).

Many respondents gave up their previous phone number or email address but often kept some tool to be connected with their old friends, such as QQ, although its role has been gradually replaced by their new email or WeChat accounts. One respondent stated, “I will keep QQ ID although I seldom talk on it now, because my old friends are all on QQ”; this view was echoed by many who referred to QQ as a backup list of their former contacts. As a less-active but still-valid communication tool, in the 2012 wave, QQ accounted for 23% of the returned questionnaires compared to 46% by email, and 2% were returned via the combination of email and QQ reminders (Table 1). In the following waves, QQ remained an important tool for real-time interaction and was used in combination with email.
Table 1

Tracking methods and responses (in percentages)

[The table body could not be recovered from the source. Its rows listed the tracking methods — email, QQ, email and QQ combined (with email), and phone — and the share of questionnaires returned via each method in the 2012, 2013, and 2014 waves.]
Still, email remained the most effective tracking method, particularly after most email addresses were confirmed or updated in the 2012 wave. In the 2013 wave, the email survey was replaced by a Web survey, but email continued to be the major way to send survey links, accounting for 64% of the responses retrieved in that wave. Furthermore, a combination of email and QQ communications also remained important, accounting for 22% and 24% of the responses retrieved in the 2013 and 2014 waves respectively (Table 1). The significance of phone numbers declined in 2013 and increased again in 2014, possibly because mobile phones became more prevalent among senior students. Partly because of their increasing compatibility with mobile apps, mobile phones also became a more important communication tool for students who were about to enter the job market.

The diverse and flexible ways of tracking respondents suggest that the use of ICT tools must be contextualized in the developing trajectories of youth growth and youth culture. To enable convenient real-time communication with respondents, the research team set up a collective QQ ID to build a connection with respondents upon respondents’ approval; research assistants took turns being online to answer questions and provide clarifications during the survey period. The online status of respondents often suggested a higher likelihood that they were not occupied with other tasks and would have the time to deal with the survey request. For the research team, it offered the opportunity to provide timely support to respondents when they were filling out the survey questionnaires. Respondents could set their status as “online” or “invisible,” allowing them the power to indicate and limit their availability. Respondents could initiate conversations and choose to respond selectively, without the real-time pressure to answer as in phone calls. This was greatly appreciated by young people, who want to be connected but also cherish their freedom and their own space. Since conversations are bilateral, the diverse communication styles not only helped the research team reach respondents more effectively but also made researchers more available to respondents.

Cultivating connections: tangible and intangible incentives

Previous studies have used various ways of providing incentives to respondents and compensating them for their participation (Navratil et al. 1994). With an increasing reliance on ICT-based communication, it has become less feasible to send gifts or vouchers to respondents directly. Different from the conventional face-to-face interview, incentives may need to be transmitted electronically, such as electronic coupons, prepaid mobile phone cards, and credits for online games and services. This panel study provided prepaid mobile phone cards of around RMB 30–50 (≈ USD 4.80–8.00) to compensate respondents, and this amount could be deposited directly to their phone numbers. As with the strategy of sending gift vouchers together with a change-of-address card (Laurie and Scott 1999), this study also used this chance to update contact information by sending confirmation messages to respondents’ phone numbers. However, such economic incentives may not be big enough to persuade some to participate, and for others, the exchange relationship may cause discomfort since they do not like the instrumental feeling of “selling information.” In both situations, the research team worked on the emotional aspects of how respondents felt about the survey in various ways.

First, the research team maintained regular contact with respondents so that respondents remained familiar with the project, but with sufficient intervals in between. On each New Year’s Day, the research team sent respondents greeting letters together with mobile phone credits. Greetings were also sent before the end of semesters, national English tests, or other exam seasons to wish them good luck. “The amount of economic incentives was not significant,” one research assistant observed, “but they were surprised that they were remembered during holidays and exam periods.” Such efforts also sent a message of long-term commitment and perseverance. Respondents saw that the research team was “always there,” determined to follow them, and making various efforts to “keep it warm.”2

Second, the one-to-one communication was combined with group messages. Research assistants, mostly graduate students at Nanjing University, made themselves available for random questions from respondents. Some respondents asked questions about preparations for the college English tests or applications for graduate schools, and some wanted to know what graduate school study was like. With the assistance of ICT tools, such sharing helped the research team provide tangible and intangible incentives that suited the respondents’ needs. However, researchers should also make deliberate trade-offs between keeping connected and avoiding “too-close” contact, and reflect on how much interaction is too much. To minimize the possible impact that one-to-one communication might have on respondents’ ideas, such communication was used only when necessary and initiated by respondents, and research assistants were reminded to utilize neutral and value-free approaches in their communications in order to minimize the social-desirability bias among respondents. Although the social-desirability bias could be a potential threat to the objectivity of survey results, such negative consequences have been found to be less of a problem in Internet surveys based on anonymous responses and self-administration processes (Holbrook and Krosnick 2010).

Third, researchers adopted elements of youth language in communicating with respondents, including informal wording and ideograms (or emoji), and combined them with formal wording in seeking cooperation. The use of informal wording and emoji, including virtual facial expressions (smile, sigh, sweat, and so on), could greatly soften the tone of the request and create a casual, friendly, and relaxed atmosphere for conducting the survey. When research assistants used the language of the youth to communicate, they also showed respondents that they were interacting with real human beings, not cold machines. Such expressions were popular among respondents’ peers, and with the aid of this language, “respondents were more willing to communicate with us,” according to one research assistant. She sometimes called such communication “playing cute” (卖萌 maimeng), a common self-expression gesture among female university students. By using “cute” words and emoji, a more-egalitarian relationship was created instead of the conventional relationship between researchers and respondents. These casual and real-time interactions allowed respondents to ask questions such as “Why is it so long?” “What is the question for?” and “Which category applies to me?” Answering these questions helped clarify questionnaire-related problems and minimize confusion.

In addition to the emotional aspect of familiarity and closeness, it was also important for respondents to be aware of the significance of their participation. In each wave, the survey request was sent in a formal email from the research team represented by the survey coordinator at Nanjing University. The letter not only explained the purpose of the survey but also conveyed gratitude for respondents’ long-term participation by creating a “project identity” (Navratil et al. 1994). When the response rates were not ideal at the beginning of the wave, the research team sent another formal email to thank those who participated, stating the importance of collecting continuous waves of information and reassuring respondents of the confidentiality of the survey. At the end, the research team also sent respondents short reports about the survey. By sending respondents feedback about how the data they provided were used as other previous studies have done (Flick 1988), the research team conveyed the message that respondents who continued their participation made a real difference and that the project provided meaningful information on youth growth trajectories.

The research team was aware that the increase in interaction could introduce new biases. When respondents became more familiar with the survey instruments and when the connections were maintained over the different waves, it was necessary to compare the remaining sample with respondents who quit the survey on key dimensions. The two groups showed no significant differences in most socioeconomic characteristics, university experiences, and social activities across waves (father’s job, father’s party membership, high school rank, attending university, university rank, party and youth league membership, school activities, doing part-time jobs, currently dating). However, compared to those who quit the survey, the remaining sample had a higher proportion of females (59% compared to 50% in 2013) and a lower proportion of single children (84% compared to 90% in 2013). This may reflect a greater willingness to cooperate among female and non-single-child respondents over time after they have been followed for several waves.
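Group comparisons of this kind are commonly checked with a two-proportion z-test. The sketch below is illustrative only: the counts are assumptions reconstructed from the figures reported above (749 stayers in 2013 against a baseline of 1027, and the 59% vs. 50% female shares), not output from the study’s own analysis:

```python
from math import sqrt, erf

def two_prop_ztest(x1, n1, x2, n2):
    """Two-sided z-test for the difference between two sample proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)            # pooled proportion under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Assumed counts: 59% female among 749 stayers vs. 50% among the
# 278 respondents (1027 - 749) who had dropped out by 2013.
z, p = two_prop_ztest(round(0.59 * 749), 749, round(0.50 * 278), 278)
```

With these assumed counts, the gender difference would be statistically significant at conventional levels, consistent with the paper’s observation that females were overrepresented among the stayers.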

As such, the efforts to keep connected work better for some respondents than for others, and researchers have to be cautious about the potential biases that may be introduced by an increase in interaction with respondents. The concern with keeping the remaining sample representative has to be balanced against the goal of retaining more respondents in the sample. Furthermore, too much interaction could have other negative impacts on tracking outcomes. By using ICT tools, people can now block messages from someone or become “invisible” to others. Their reluctance to communicate suggests a mixed meaning of trust—that it is not only built on familiarity but also related to an appropriate distance in social interactions. It was thus important for the research team to utilize existing social networks and avoid intruding into respondents’ lives directly.

Trust and privacy: personalized communication from a safe distance

To bridge the distance between researchers and respondents, it is sometimes useful to “establish formal or informal relationships” with third parties, such as public and private agencies (Navratil et al. 1994). The research team also relied on some key informants who were willing to help, usually one from each high school class in the baseline survey. They helped the “friend requests” survive the filtering process of receivers, provided useful information to those who were suspicious about the survey, and facilitated the timely circulation of survey requests. Another important way to be connected with respondents indirectly was to join the QQ groups for the high school classes that were sampled in the baseline wave. The request was sometimes rejected by the “managers” of these class groups, but the process was much easier if the invitation came from one of the insiders, often one of the key informants. This group-level communication became a very useful tool for conveying messages, even without direct one-to-one interaction. The research team could post survey reminders in the QQ group as well as notices of sending compensation. Some respondents asked questions about the survey or confirmed that they had received compensation, and these messages could be informative to others in the group as well.

However, this did not mean research assistants could consider themselves insiders of the QQ groups. As one research assistant observed, it was important to “keep a distance from their in-group conversations, act like invited guests, and speak only when someone asks… questions” because “it is their space, not mine.” Research assistants also found that, if possible, it was better to rely on the group manager or a key informant to forward the survey information. Information sent by insiders was often better received and could result in a higher level of cooperation. Having an insider serve as a mediator helped connect researchers and respondents indirectly and reduced negative feelings of disruption and intrusion.

In an era when online “cheating” is not uncommon, researchers recognized the importance of protecting the confidentiality of respondents and dealing with personal information carefully. In online conversations, research assistants mostly focused on the questions and procedures per se and would only engage in conversations when there was a request from respondents. The research team also used QQ more often than other communication tools because of this concern for privacy. The QQ account is usually shown as a virtual ID, while the Renren alumni platform can reveal personal schooling information, and the WeChat account is linked with its “friend circles” that contain personal data. The research team continually tried to incorporate more communication tools while remaining alert to their different characteristics regarding the privacy of users.

To avoid disruption, researchers relied more on the indirect methods of email and QQ communications than on direct phone calls, and limited the number of reminders to allow a sufficient time gap in between. In the 2014 wave, the research team sent only one reminder, around the end of the year, combined with Christmas greetings, New Year greetings, and a good-luck message for those taking graduate school entrance examinations at the time. The message suggested, “You can ignore our reminder now and concentrate on your tests,” and emphasized that “we can wait for you to participate when you are not busy.” In the 2 days after this reminder was sent, the number of completed surveys rose markedly, from 420 to 538 of the 756 respondents who ultimately participated.
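The reminder’s effect can be expressed as simple response-rate arithmetic. A minimal sketch in Python (the function name is hypothetical; the figures are those reported above):

```python
# Quantify the effect of the single year-end reminder:
# completed surveys rose from 420 to 538 out of 756 eventual participants.

def response_rate(completes: int, eligible: int) -> float:
    """Return the response rate as a proportion of eligible respondents."""
    return completes / eligible

before = response_rate(420, 756)  # rate before the reminder
after = response_rate(538, 756)   # rate two days after the reminder
gain = after - before             # percentage-point gain in the reminder window

print(f"before: {before:.1%}, after: {after:.1%}, gain: {gain:.1%}")
# → before: 55.6%, after: 71.2%, gain: 15.6%
```

A gain of roughly 16 percentage points within 2 days is what the text characterizes as a significant increase.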

Although there is no standard answer as to how actively and frequently researchers should contact respondents, previous studies have suggested that limited contact efforts lead to significant participant loss (Cotter et al. 2005). This study, however, suggests that too much contact may also harm tracking outcomes. The findings point to the importance of leaving some distance between researchers and respondents and of assuring respondents that their privacy and confidentiality are respected and prioritized. Restraining contact efforts may have cost some participation in the follow-up surveys, but this study also provides evidence that indirect ways of communicating with respondents can help secure participation to some extent.


Previous studies of tracking in panel studies have focused more on quantitative measures of attrition and on the characteristics of the missing and remaining respondents in the sample. With a focus on the survey outcome, the existing research has examined how to strengthen the incentives to participate. Youth respondents exemplify several of the challenges that panel studies face in contemporary China, given increasing social mobility and the persisting significance of social connections. Following young people who grow up amid greater uncertainty in their life and migration patterns requires an extraordinary amount of communication effort compared to regular panel studies (Cotter et al. 2005). Given the unprecedented reach of Internet coverage and mobile devices, researchers may find it easy to contact respondents, but this study finds that retaining young respondents in the sample requires cultivating a relationship of comfort and trust at an appropriate distance.

This study contributes to the existing research on panel studies in three ways. First, it complements quantitative measures of attrition by illustrating the various efforts made to minimize the attrition rate, based on a qualitative understanding of effective communication patterns. In the first wave of this study, youth respondents were relatively easy to access through introductions from their teachers. However, as they transitioned to a perceived status of mature and sophisticated adulthood and reflected on their life experiences, family environment, and school authorities, their willingness to cooperate in subsequent waves varied. They also became a highly mobile group that cannot easily be reached by traditional communication methods. In response, the research team tried diverse means of enhancing the response rate. Given the distinct qualitative differences among tracking methods (see Table 2), researchers need to be prepared for varied preferences among respondents, which may change over time. Furthermore, different ICT tools have strengths and limits that need to be continuously evaluated, and researchers must make deliberate trade-offs to minimize the negative impacts of using them. Researchers must reflect on how to stay connected while avoiding too much interaction and must adjust tracking strategies that do not transfer well across contexts. For example, email and mobile phones have advantages in many aspects of communication, but they are more effective in some situations than in others. If a panel study relies only on email or mobile phones, tracking outcomes may be undermined by uncertain use frequency (email) and possible disruption (mobile phones). Exclusive reliance could also introduce a selection bias toward people who use email or mobile phones frequently while missing other groups.
Although ICT tools may be well suited to panel studies of mobile young people who attend universities or enter the labor market via migration, relying too heavily on them may be inappropriate in other kinds of panel studies. Another example is QQ groups, which provided a good platform in this study because of the cluster sampling of school classes; this may not apply to other panel studies whose respondents are not organized into, or prepared for, this kind of group communication.
Table 2

Advantages and disadvantages of ICT tools

| Tool            | Possibility of change | Use frequency       | To be connected             |
|-----------------|-----------------------|---------------------|-----------------------------|
| Email           |                       | Relatively low      | Possible but may be ignored |
| QQ (individual) |                       | Relatively frequent | To be approved by receivers |
| QQ (group)      |                       | Relatively frequent | To be approved by receivers |
| Home phone      |                       |                     | Possible but may be ignored |
| Mobile phone    | Relatively high       | Extremely frequent  | Possible but may be ignored |

Second, this study focuses on the process rather than the outcome of tracking. ICT tools provide significant help in reaching respondents, but they cannot solve the problem of how to motivate potential respondents to participate; respondents may reject a survey invitation simply because they are not in a good mood. As discussed above, ICT tools have pros and cons in bridging researchers and respondents, particularly young people. Better educated and better equipped with information technology, young people are eager to stay in touch yet also want independence from teachers, parents, and other authorities. To smooth communication with youth respondents and win their cooperation, the use of communication tools must take into account the characteristics of youth culture and youth development. Tracking is a continuing interactive process in which researchers should keep exploring how respondents feel about the survey. Researchers have to adapt to the language of communication among youth, and the research design needs to be improved to speak to the needs and characteristics of respondents. Although the biggest loss of respondents occurred when they moved from high school to university, the research team adjusted the tracking strategy immediately and continuously and has managed to retain the majority of the remaining sample since then. As such, the remaining sample is a subset of the original respondents, and its size and representativeness are the result of cumulative efforts across the different stages of the panel study (Callegaro and Disogra 2008; Disogra and Callegaro 2010).
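The cumulative character of these efforts is reflected in how response metrics for panels are commonly computed: as the product of the rates achieved at each stage of recruitment and participation (cf. Disogra and Callegaro 2010). A minimal sketch, with made-up stage rates purely for illustration:

```python
from functools import reduce

def cumulative_response_rate(stage_rates):
    """Multiply per-stage rates (e.g., recruitment, profile completion,
    wave completion) to obtain a panel's cumulative response rate."""
    return reduce(lambda a, b: a * b, stage_rates, 1.0)

# Illustrative (hypothetical) rates for a three-stage panel:
rates = [0.60, 0.85, 0.70]  # recruitment, profile, wave completion
print(f"cumulative: {cumulative_response_rate(rates):.3f}")
```

Because the rates multiply, a loss at any single stage, such as the high-school-to-university transition described above, depresses the final figure no matter how well the other stages perform.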

Third, this study adds the importance of nonmaterial incentives to that of material incentives in reducing the attrition rate. Instrumental rewards may help motivate survey participation, but respondents may grow less responsive to material incentives over time; panel studies thus often need to increase the rewards in later waves (Ribisl et al. 1996). It has become more important to maintain close contact, show respect, and provide nonmaterial incentives: sending occasional greetings to “keep it warm,” giving suggestions and support regarding university life, and “softening” the conversations, in addition to relying on the reputation of the survey institutions. Given the long-term cultivation of familiarity between researchers and respondents, some respondents remarked that survey participation also offered a moment to reflect on their own growth trajectories. However, increasing familiarity has to be combined with an appropriate social distance, based on respect for privacy and the minimization of disruption, with the aid of existing networks and key informants. In other words, researchers cannot merely rely on the convenience and interactivity of ICT tools but must address the challenges such tools bring to survey design and data collection (Couper and Miller 2008).

Young people have been perceived as a group that benefits most directly from the nation’s education aspirations and modernization campaign. To follow them is to follow the rapid changes in people’s lived experiences of education, occupation, social mobility, and communication technologies. Unlike probability-based online panels that usually need to provide Internet access to sampled units (Revilla et al. 2016), this study is facilitated by the prevalence of ICT tools and the increased use of mobile Web applications among young people. These present new opportunities and challenges for data collection, especially in responding to respondents’ actions (Couper and Miller 2008). Based on a continuously updated understanding of how respondents felt about being part of the panel study, researchers tried to incorporate a human touch to deal with the distrust and noncooperation of respondents on a virtual platform. Rather than relying only on modern ICT tools and material incentives, this study rediscovered the importance of diversity, trust, and respect for privacy, reinterpreted for the overloaded information era and an ever-changing youth culture.

Exploring various tracking methods can be time-consuming in an information era with so many communication tools to choose from, and it cannot be fully planned from the beginning of a panel study. In other words, adjusting tracking methods is also a self-learning process through which the research team acquires new technologies; more importantly, using new ICT tools requires a human touch and needs to be contextualized in youth development processes. Persistent trial and error, creative innovation, teamwork, and understanding matter more than any single tool in dealing with long-standing attrition issues and are likely to lead to favorable tracking outcomes.


QQ is an instant messaging software service developed by the Chinese company Tencent. It offers both individual and group chat services; users can send requests to be connected with others as “friends” and can approve others’ requests to be “friends.”


The occasional comments from respondents were recorded in the field notes of the research team and research assistants.




The study was supported with research funds from Hong Kong Baptist University (SOSC/07-08/CERGIAS-6 and FRG2/11-12/033), the Research Grants Council of Hong Kong Special Administrative Region (HKBU245612 32-12-456), and the Research Committee at The Chinese University of Hong Kong. We wish to thank the anonymous reviewers for their insightful comments and suggestions.

Availability of data and materials

Data are available through collaboration with the research team.

Authors’ contributions

All the authors contributed to the framework of the study. JS drafted the manuscript, and all authors revised and approved the final manuscript. GL led the research project as the PI and constructed the theoretical framework of the survey. XF coordinated the data collection and the data cleaning. OW merged and recoded the data set. All the authors contributed to the interpretation of the tracking processes and findings.

Competing interests

We confirm that this manuscript has not been published elsewhere and is not under consideration by another journal. The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Authors’ Affiliations

Gender Studies Programme, The Chinese University of Hong Kong, Sha Tin, China
Department of Sociology, Hong Kong Baptist University, Kowloon Tong, China
Department of Sociology, Nanjing University, Nanjing, China


  1. Boys, Annabel, John Marsden, Garry Stillwell, Kevin Hatchings, Paul Griffiths, and Michael Farrell. 2003. Minimizing respondent attrition in longitudinal research: Practical implications from a cohort study of adolescent drinking. Journal of Adolescence 26: 363–373.
  2. Callegaro, Mario, and Charles Disogra. 2008. Computing response metrics for online panels. Public Opinion Quarterly 72 (5): 1008–1032.
  3. China Data Center, Tsinghua University. 2012. “Daxue biyesheng: Cong jiaozi dao pingmin” (University graduates: From outstanding to ordinary people). Accessed 13 June 2017.
  4. Cotter, Robert B., Jeffrey D. Burke, Rolf Loeber, and Judith L. Navratil. 2002. Innovative retention methods in longitudinal research: A case study of the developmental trends study. Journal of Child and Family Studies 11: 485–498.
  5. Cotter, Robert B., Jeffrey D. Burke, Magda Stouthamer-Loeber, and Rolf Loeber. 2005. Contacting participants for follow-up: How much effort is required to retain participants in longitudinal studies? Evaluation and Program Planning 28: 15–21.
  6. Couper, Mick P., and Peter V. Miller. 2008. Web survey methods: Introduction. Public Opinion Quarterly 72 (5): 831–835.
  7. Disogra, Charles, and Mario Callegaro. 2010. Computing response rates for probability-based online panels. In Proceedings of the joint statistical meeting, survey research methods section, ed. AMSTAT, 5309–5320. Alexandria: AMSTAT.
  8. Feng, Xiaotian. 2006. “Zhuizong yanjiu: Fangfalun yiyi ji shishi” (Panel study: The meaning and implementation of methodology). Huazhong shifan daxue xuebao (Journal of Central China Normal University) 6: 43–47.
  9. Flick, Susan N. 1988. Managing attrition in clinical research. Clinical Psychology Review 8: 499–515.
  10. Haggerty, Kevin P., Charles B. Fleming, Richard F. Catalano, Renee S. Petrie, Ronald J. Rubin, and Mary H. Grassley. 2008. Ten years later: Locating and interviewing children of drug abusers. Evaluation and Program Planning 31: 1–9.
  11. Holbrook, Allyson L., and Jon A. Krosnick. 2010. Social desirability bias in voter turnout reports. Public Opinion Quarterly 74 (1): 37–67.
  12. Laurie, Heather, and Lynne Scott. 1999. Strategies for reducing nonresponse in a longitudinal panel survey. Journal of Official Statistics 15: 269–282.
  13. Li, Chunling. 2013. “Zuinan jiuye nian de daxue biyesheng jiuye zhuangkuang—jiyu 12suo gaoxiao biyesheng zhuizong diaocha” (Employment situation of university graduates in the “hardest job search year”—panel study of university graduates among 12 universities). In Shehui lanpishu (The book of China’s society), ed. Peilin Li, Guangjin Chen, and Yi Zhang, 197–214. Beijing: Social Sciences Academic Press.
  14. Li, Chunling, and Yunqing Shi, eds. 2013. Jingyu taidu yu shehuizhuanxing: 80hou qingnian de shehuixue yanjiu (Experience, attitudes and social transition: A sociological study of the post-80s’ generation). Beijing: Social Sciences Academic Press.
  15. Liang, Yucheng. 2011. “Zhuizong diaocha zhong de zhuizong chenggonglv yanjiu—Shehuizhuanxingtiaojian xia de zhuizong sunhao guilv he jianyi” (Research on the success tracking rates in panel survey: Sample attrition in the context of social transition). Shehuixue yanjiu (Sociological Studies) 6: 132–153.
  16. Navratil, Judith L., Stephanie M. Green, Rolf Loeber, and Benjamin B. Lahey. 1994. Minimizing subject loss in a longitudinal study of deviant behavior. Journal of Child and Family Studies 3: 89–106.
  17. Osburg, John. 2013. Anxious wealth: Money and morality among China’s new rich. Stanford: Stanford University Press.
  18. Rao, Kumar, Olena Kaminska, and Allan L. McCutcheon. 2010. Recruiting probability samples for a multi-mode research panel with Internet and mail components. Public Opinion Quarterly 74 (1): 68–84.
  19. Revilla, Melanie, Anne Cornilleau, Anne-Sophie Cousteaux, Stephane Legleye, and Pablo de Pedraza. 2016. What is the gain in a probability-based online panel of providing Internet access to sampling units who previously had no access? Social Science Computer Review 34 (4): 479–496.
  20. Ribisl, Kurt M., Maureen A. Walton, Carol T. Mowbray, Douglas A. Luke, William S. Davidson, and Bonnie J. Bootsmiller. 1996. Minimizing participant attrition in panel studies through the use of effective retention and tracking strategies: Review and recommendations. Evaluation and Program Planning 19: 1–25.
  21. Simmel, Georg. 2002. The metropolis and mental life. In The Blackwell City reader, ed. Gary Bridge and Sophie Watson. Oxford: Wiley-Blackwell.
  22. Sproull, Lee, and Sara B. Kiesler. 1991. Connection: New ways of working in the networked organization. Cambridge: The MIT Press.
  23. Stouthamer-Loeber, Magda, Welmoet Van Kammen, and Rolf Loeber. 1992. The nuts and bolts of implementing large-scale longitudinal studies. Violence and Victims 7: 63–78.
  24. Sun, Yan. 2012. “Zhongguo jiating dongtai genzong diaocha: 2010nian jixian diaocha yangben lianxi qingkuang” (China family development panel study: 2010 baseline survey sample contacting situation). Zhongguo jiating genzong diaocha xilie baogao (China family panel study reports series: CEPS-5).
  25. Sun, Yan, Yanhui Zou, Hua Ding, Jie Yan, Gu Jiafeng, and Zeqi Qiu. 2011. “Genzong diaocha zhong de jufang xingwei fenxi—Yi zhongguo jiating dongtai genzong diaocha weili” (Effects of respondent household and its head’s SES on panel refusal: A case study of CFPS). Shehuixue yanjiu (Sociological Studies) 2: 167–181.
  26. Wang, Weidong, Guihua Xie, and Lingxin Hao. 2014. Rural panel surveys in developing countries: A selective review. Economic and Political Studies 2: 151–177.
  27. Wu, Xiaogang. 2017. Higher education, elite formation and social stratification in contemporary China: Preliminary findings from the Beijing College Students Panel Survey. Chinese Journal of Sociology 3 (1): 3–31.
  28. Yan, Yunxiang. 2011. The individualization of the family in rural China. Boundary 2 (38): 203–229.
  29. Zuppo, Colrain M. 2012. Defining ICT in a boundaryless world: The development of a working hierarchy. International Journal of Managing Information Technology 4: 13–22.


© The Author(s). 2018