
Article
September 2018

Electronically mediated work: new questions in the Contingent Worker Supplement

The U.S. Bureau of Labor Statistics (BLS) added four questions to the May 2017 Contingent Worker Supplement. These questions were designed to measure an emerging type of work—electronically mediated work, defined as short jobs or tasks that workers find through websites or mobile apps that both connect them with customers and arrange payment for the tasks. After extensive review, BLS determined that these questions did not work as intended and had a large number of incorrect “yes” answers. To eliminate these false positives, BLS manually recoded the data using verbatim responses available only on the confidential microdata file. Using these recoded data, BLS estimates that electronically mediated workers accounted for 1.0 percent of total employment in May 2017. In the interest of transparency, BLS is releasing both the collected data and the recoded data. This article describes the process of developing the four questions and summarizes the evaluation of the data, the recoding of the data, estimates of electronically mediated workers, and lessons learned.

The Contingent Worker Supplement

The Contingent Worker Supplement (CWS) is a set of questions that has periodically been appended to the nation’s monthly labor force survey, the Current Population Survey (CPS).1 The CWS, first fielded in 1995, is designed to measure the number and characteristics of contingent workers and workers in four alternative employment arrangements—independent contractors, on-call workers, temporary help agency workers, and workers provided by contract firms. The survey was fielded four more times—in 1997, 1999, 2001, and 2005—with a largely unchanged questionnaire.

In 2016, the U.S. Bureau of Labor Statistics (BLS) obtained funding to field the CWS in May 2017. One major goal of the 2017 CWS was to see how the number of contingent workers and workers in alternative employment arrangements had changed since 2005. Therefore, in order to maintain data comparability over time, the 2017 questionnaire was largely the same as that used when the data were last collected.

Many stakeholders were interested in adding questions to the CWS to collect information about a variety of other topics. However, the development of new questions can be a lengthy process, and BLS had limited time to make changes if the survey was to be fielded in May 2017. First, to comply with Office of Management and Budget (OMB) guidance, all substantive changes to federal survey questionnaires are evaluated, and proposed changes are announced and provided to the public for comment, which can take considerable time. Additionally, the U.S. Census Bureau—which conducts the survey for BLS—had adopted new software for its data collection instrument since the 2005 survey, so the CWS needed to be completely reprogrammed in the new software and tested thoroughly. (The data collection instrument is the custom-designed software used by Census Bureau interviewers to conduct the survey and collect responses.) Because the existing survey had an extremely complicated questionnaire, many rounds of systematic testing would be necessary to ensure that the survey instrument was programmed correctly. Developing and adding new questions and ensuring that these questions were programmed correctly would strain an already ambitious schedule.

After consulting with the Census Bureau, BLS determined that, given the time constraints and the need to minimize respondent burden, it was not possible to add more than four straightforward questions—that is, four questions with limited skip-and-fill patterns and with limited response options.2 Also, the four questions would need to be added to the end of the CWS questionnaire so that there was no impact on responses to earlier questions. In addition, placing questions at the end of the questionnaire would simplify the programming of the data collection instrument.

Developing a draft set of questions to add to the CWS

In early 2016, BLS formed a team of economists and survey methodologists to investigate the possibility of adding four questions to the CWS. While the group considered several topics—such as second jobs, flexibility of work, advance notification of work schedule, and contingent work and alternative employment arrangements over a longer time span than the previous week—consensus coalesced around obtaining data about work arrangements that have emerged since 2005.

New terms are being used in relation to this emerging type of work, such as “gig workers” and “gig economy.” BLS does not have a definition for these terms, and there is no generally accepted definition among researchers. Many definitions of gig workers include people in temporary jobs, independent contractors, on-call workers, and day laborers—all of which can be estimated with CWS data. However, many definitions also include people in types of work arrangements that did not exist when the survey was last fielded. Many researchers and policymakers have expressed a need for additional data on emerging work arrangements to paint a more complete picture of gig workers, especially since anecdotal evidence suggests a sharp rise in the number of these workers in recent years.

The CWS seemed an appropriate survey for collecting these new data. BLS decided to focus on one emerging type of work that most researchers consider to be a type of gig work—one that is sometimes referred to as “electronically mediated work” or “online platform work.”3 In this type of employment arrangement, workers

·       use a company’s website or mobile app to connect to clients or customers and obtain short jobs, projects, or tasks;

·       are paid by or through the company that owns the website or mobile app;

·       choose when and whether to work; and

·       may do these short jobs, projects, or tasks in person or online.

There are many examples of this type of work. For instance, some people use their own cars to transport people from place to place, having obtained customers through a mobile app that facilitates payment for the ride. (Companies that currently enable this kind of work include Uber and Lyft.) Others do household chores or yardwork after finding clients through a mobile app or website that later arranges payment for the work. (Examples of companies that focus on these types of short-term jobs include TaskRabbit and Handy.) Additionally, some workers do work entirely online, such as taking surveys, adding descriptive keywords to photos or documents, or designing webpages for businesses. (Companies such as Amazon Mechanical Turk and Clickworker enable this type of work.)

Note that workers are not considered electronically mediated workers simply because they use a website or mobile app to do their work. The website or app must be used to connect them directly to customers or short-term jobs or tasks, and workers must also be paid by or through the company that owns the website or app. People who find customers or jobs through online ads but are not paid by the company that owns the website where they posted the ad are not considered electronically mediated workers. For example, work found through a Craigslist.com ad is not considered electronically mediated work.

Moreover, many businesses have websites or mobile apps that their employees may use to carry out their work. For example, a company’s driver may use a mobile app to map a route when making deliveries; however, this alone does not constitute electronically mediated work. Similarly, some businesses—such as coffee shops or fast-food restaurants—allow customers to order through a website or app. These businesses typically have a dedicated staff to complete orders. Thus, a barista at a coffee shop is not considered an electronically mediated worker just because a customer may order a beverage through a mobile app.
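
To make the definitional boundary concrete, the test described above (the website or mobile app must both connect the worker directly with customers or short tasks and arrange payment for that work) can be summarized in a brief sketch. This is purely illustrative; the function and argument names are hypothetical and do not correspond to any BLS classification tool.

```python
# A minimal, illustrative sketch of the definition above: the website or app must
# BOTH connect the worker with customers or short tasks AND arrange payment.
# The function and field names are hypothetical; this is not a BLS classification tool.

def is_electronically_mediated(found_via_platform: bool,
                               paid_through_platform: bool,
                               short_tasks: bool) -> bool:
    """Return True only when all parts of the definition are met."""
    return found_via_platform and paid_through_platform and short_tasks

# Work found through an online ad but paid directly by the customer
# (for example, a job found through a Craigslist.com posting) fails the payment test.
print(is_electronically_mediated(found_via_platform=True,
                                 paid_through_platform=False,
                                 short_tasks=True))   # -> False

# A ride obtained and paid for through a ride-sharing app meets all three conditions.
print(is_electronically_mediated(found_via_platform=True,
                                 paid_through_platform=True,
                                 short_tasks=True))   # -> True
```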

Electronically mediated workers often have the ability to choose when and how much they work. Some people do electronically mediated work as their only source of income, while others do this type of work as a second job or “on the side.”

Though the measurement of electronically mediated work, like that of other emerging types of work, had been little researched, the BLS team began by reviewing the existing literature.4 The concepts used to define electronically mediated work are quite complicated, and many decisions had to be made in developing questions to identify this type of work.

Self-reporting versus proxy reporting

In both the CPS and CWS, one person answers the survey questions about everyone living in the household. Thus, respondents provide data about themselves (self-reports) and others living with them (proxy reports). People who report about other household members may not be able to answer all questions about others’ employment arrangements. While it would be possible to have all household members report only about themselves, this would be quite expensive because interviewers would have to contact some households multiple times in order to obtain responses from all household members. Therefore, respondents need to be able to answer all questions both about themselves and others.

Reference period

The CPS and the CWS use a “last week” reference period—that is, the week including the 12th of the month. Because the four new questions would follow the CPS and the CWS, it would make sense that they too focus on “last week.” However, while some people may do electronically mediated work as a full-time job, there is evidence that many do this type of work sporadically.5 Thus, focusing on “last week” might understate the number of people engaging in such work, as it would fail to capture those who regularly perform electronically mediated work but did not do so during the past week.

Therefore, BLS team members considered using a longer reference period, such as the past month or past year, but were concerned that this would confuse respondents because so many previous questions in both the CPS and CWS focus on “last week.” Further, a long reference period can sometimes be difficult for respondents because they may not remember when certain activities occurred. For example, if respondents are asked how many times they did a particular activity in the last year, they may include activities from 2 or 3 years ago. In addition, if a longer reference period were used for the new questions, the number of workers doing electronically mediated work would not be comparable with other CWS estimates. With the same reference week, the interaction of electronically mediated work with contingent work and alternative employment arrangements could be explored. For example, BLS could estimate the number of electronically mediated workers who were also contingent workers or independent contractors. To both avoid respondent confusion and keep measures on a comparable basis, BLS decided to use the reference period of “last week” for these new questions.

Question universe

The CWS questions are asked of employed people.6 However, some researchers have suggested that people do not consider electronically mediated work to be a job.7 If true, people who only did electronically mediated work might be undercounted if the questions were limited to those classified as employed through answers to CPS questions.

Although keeping the same universe for the new questions would be practical, the BLS team evaluated whether the question universe should be expanded to include those who were not employed—that is, either unemployed or not in the labor force. However, the most basic of the labor force questions in the CPS asks “LAST WEEK, did you do ANY work for pay?”8 People with responses of “no” to this question (and who were not temporarily absent from a job) would be classified as unemployed or not in the labor force. In order to expand the question universe to those who were not employed, the CWS would essentially have to repeat this question or a variant of it to people who had already answered “no.” This could frustrate respondents who felt that they had already answered the question. In addition, BLS research suggests that the effect of missed informal work on total employment estimates is likely to be small.9 Given these concerns, BLS decided to restrict the universe of the new questions to the employed.

Question scope

The existing questions in the CWS apply only to a person’s main job. For the relatively few people with more than one job (about 1 in 20 workers in 2017), this is the job in which they usually work the most hours. Because anecdotal evidence suggests that many people do electronically mediated work in addition to a regular job, the team investigated expanding the scope of the questions to include all reported jobs. Also, as mentioned above, some researchers have suggested that some people may not view electronically mediated work done on the side as a job, which could cause electronically mediated work to be underrepresented in measures of second jobs. One way to expand the scope of the questions would be to ask about any work done in the reference week, not just work for the main job.

BLS team members feared that respondents would be confused by a sudden shift to questions asking about any work after having answered so many questions about their main job. In addition, asking about any work, rather than the main job, would mean that data on electronically mediated work would not be on a comparable basis with the other data collected in the CWS.

Despite concerns about respondent confusion, BLS thought that it was important to expand the scope to provide more information about the relatively little-studied topic of electronically mediated work. To address concerns about comparability, information could be collected about whether electronically mediated work had been done for the main job, a second job, or additional work for pay. The additional-work-for-pay category was expected to capture electronically mediated work done by people who do not consider such work to be part of a job.

Question subject

Some researchers were interested in distinguishing between electronically mediated work done entirely online and that done in person, speculating that their effects on the labor market might be different.10 In particular, in-person electronically mediated work would more likely affect local labor markets, while electronically mediated work done entirely online would more likely impact the global labor market and be influenced by international regulations and trends. In addition, some researchers suggested that in-person electronically mediated work was more likely to be a sole source of income because it may require a greater time commitment. By contrast, people might be more likely to do online electronically mediated work on an intermittent basis to supplement their incomes.11 For this reason, the BLS team decided to distinguish between these two types of work.

Question wording

There were considerable challenges to designing a set of only four questions that would be clearly understood by respondents. One of the easiest ways to ask questions about electronically mediated work would be to ask whether respondents (or members of their household) had done work through specific companies, such as Uber, Lyft, or TaskRabbit. However, BLS survey questions, by longstanding tradition, do not use specific company names because companies can change, especially in emerging industries or fields. Companies popular at the time of initial survey development may no longer exist when the survey is fielded. BLS attempts to minimize changes to questionnaires because even small changes to question wording can affect responses and, thus, data comparability over time. In addition, respondents may focus only on the company named and omit similar companies. For example, respondents may fail to respond about ride-share companies other than Uber and Lyft if only those companies’ names were included in a question.12 Given these concerns, BLS decided the questions should describe the characteristics of the work itself but not use company names.

BLS knew it would be difficult to design four questions about electronically mediated work that would be clear to respondents without using company names. Respondents might interpret questions about finding jobs through websites or mobile apps as questions about online job search. In addition, use of websites and mobile apps is widespread, and they are used for many different reasons. Writing questions so that respondents could clearly identify when they had used websites or apps only to facilitate electronically mediated work would be a challenge.

Difficult concepts can often be clarified by including examples in questions. The risk of using examples is that respondents may focus on the example rather than on the actual question, which may lead to incorrect answers if a respondent’s experience does not align with the example chosen. For instance, a respondent who did electronically mediated chores might answer “no” to a question that included an example about electronically mediated ride sharing. Because of this danger, BLS is cautious about including examples in questions. However, BLS believed that the advantages of using examples would outweigh the disadvantages as long as the examples were chosen carefully.

Draft questions

After much discussion of the previously mentioned topics, the team proposed the wording of the four questions. One question asked about in-person electronically mediated work. Another asked about online electronically mediated work. Both used examples to clarify the concepts. The in-person and online questions were each followed by a question about which job this work was done for—that is, whether the work was for their main job, a second job, or additional work for pay.

Stakeholder outreach

Throughout the question development process, BLS actively sought feedback about the proposed new questions. BLS staff gave many presentations and briefings about the CWS to outside groups, including congressional staff, industry groups, academics, nonprofit organizations, and other government agencies. While these presentations tended to focus on the CWS as a whole, BLS efforts to add new questions to collect more information were also described.

In addition, an early draft of the new questions was circulated to many academics, industry experts, special interest groups, and other data users. The draft questions were also discussed with the Department of Labor’s Structure of Work Policy Working Group, which had emphasized the need for up-to-date data that could be used to study how Americans’ work arrangements have changed over time. Furthermore, the new questions were reviewed and cleared by OMB. The clearance process included two periods of public comment, during which BLS received suggestions from the public.

Through these outreach efforts, BLS received considerable feedback, all of which was evaluated. BLS staff made several wording changes to the questions based on specific suggestions received. Some suggestions were not feasible given the tight timeline, such as overhauling the CWS questionnaire or developing an alternative set of four questions on a different topic. Likewise, expanding the scope of the questions to cover a longer timeframe was not deemed practical.

Cognitive testing

In accordance with OMB guidelines for statistical surveys, BLS typically cognitively tests proposed new questions before they are added to surveys.13 Cognitive testing involves administering a sample questionnaire to recruited participants and then asking a series of debriefing questions.14 These debriefing questions collect information about the response process, providing insight into whether participants understand the questions as intended, have difficulty formulating their answers, and respond “correctly” given the measurement objectives. This type of testing can be valuable in ensuring that questions measure the intended concepts.

Two cognitive testing methods were used to evaluate the electronically mediated work questions—laboratory testing and online testing. For both the laboratory and online modes, the goals of the cognitive testing were as follows:

·       To ensure that the proposed questions worked as intended—that is, that they maximized the number of true positives and minimized the number of false positives

·       To test the wording of the draft questions

·       To determine if introductory or transition language was necessary between the existing CWS questions and the electronically mediated work questions

·       To determine whether interviewer instructions or help screens were necessary to explain the key concepts

Laboratory testing

BLS staff conducted 24 interviews in their Washington, DC, cognitive testing laboratory. Participants were recruited through advertisements on Craigslist.com and through flyers handed out at a DC taxi stand and a pizza restaurant. The ads targeted workers who were employed by specific companies, such as Uber, Lyft, TaskRabbit, or GrubHub, or in specific professions. The professions were selected because they include relatively large numbers of both electronically mediated workers and traditional workers in the same occupation (for example, Uber drivers and taxi drivers). People who responded to the advertisements were asked several screening questions to ensure they had relevant experience before being invited to participate in the cognitive testing.

A trained cognitive interviewer administered an abbreviated version of the CPS and the CWS, along with the four new questions. The interviewer then debriefed participants to gain insight into their response process in order to uncover any sources of error in what was reported and ways to improve the questions.

Online testing

BLS also conducted 138 online interviews through the Amazon Mechanical Turk (mTurk) platform. While online interviews differ from how the CPS is conducted—that is, in person or by telephone—they allowed BLS to recruit participants in a broad variety of professions and outside the DC area. Also, since mTurk is itself an example of a platform that facilitates online electronically mediated work, online interviews allowed BLS to recruit a large number of individuals for whom the online question would be relevant.

Results for the in-person electronically mediated work question

The in-person question asked about short, in-person jobs or tasks that people find through companies that connect them with customers through a website or mobile app and also coordinate payment for the service. The in-person question performed differently in the two cognitive testing modes. The cognitive interviews conducted in the laboratory contained some false positive responses. Through the debriefing questions, BLS survey methodologists determined that 4 (out of 14) participants who said they had done in-person electronically mediated work had not actually done so. Instead, they had obtained clients through websites (such as Craigslist.com) but were not paid through those websites. Additionally, two of the three proxy responses of “yes” to the in-person question were found to have similar errors. However, most responses to the in-person question were correct, and participants seemed to understand the question as intended.

The in-person question performed better in the laboratory testing than it did in the online mTurk testing. During the mTurk testing, there were 18 (out of 57) false positive “yes” responses and 13 (out of 81) false negative “no” responses. The false positives were determined by evaluating open-ended text descriptions of jobs. Seven false positives were due to participants identifying mTurk tasks performed in the previous week—which were done entirely online—as in-person electronically mediated work. Other false positives were made by respondents who obtained clients through a website but were not paid through that site. The false negative determinations were made by having participants select from a list of electronically mediated work platforms through which they had worked during the previous week.
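
For reference, the error rates implied by these counts can be computed directly from the figures above (a simple arithmetic check, not additional testing data):

```python
# Error rates implied by the mTurk results for the in-person question.
false_positive_rate = 18 / 57   # share of "yes" responses judged to be incorrect
false_negative_rate = 13 / 81   # share of "no" responses judged to be incorrect
print(f"{false_positive_rate:.0%} false positives, {false_negative_rate:.0%} false negatives")
# -> 32% false positives, 16% false negatives
```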

Results for the online electronically mediated work question

The online question asked about short, paid tasks done entirely online that people find through companies that maintain online lists of tasks. Very few people responded “yes” to this question in the cognitive testing interviews conducted in the laboratory. (Because the mTurk testing was planned, and mTurk is a platform through which people do electronically mediated work entirely online, BLS focused its efforts on recruiting cognitive test participants for the laboratory who were likely to have done in-person electronically mediated work.) All three “yes” responses to this question collected in the laboratory were determined through the debriefing to be false positives. These participants said “yes” either because they (or their household members) found clients online or because they did some of their work online.

The mTurk testing yielded mixed results for the online question. There were very few false positives but many false negatives (31 out of 42 “no” responses). Many participants who answered “no” did not include mTurk tasks they had done in the previous week. Most of these participants did not think the online question was intended to include mTurk tasks. This could be a result of administering the testing via mTurk; participants may have excluded their mTurk work because BLS knew they were on mTurk.

Results of the “which job” questions

Both the in-person and online questions were followed by a “which job” question; respondents who said “yes” to either question were asked whether that work had been done for the main job, a second job, or additional work for pay. In the laboratory, the cognitive testing did not probe specifically about participants’ answers to these two follow-up questions, although interviewers did probe when participants displayed obvious difficulty with either of them. Some participants found it difficult to distinguish between a second job and additional work for pay because they did not think of electronically mediated work as a job. In the mTurk testing, most participants said they did electronically mediated work—particularly online work—as additional work for pay.

In the CPS, main job and second job concepts are communicated through the survey questions. However, the truncated version of the CPS interview given during the cognitive testing asked only about the main job and did not include any questions about the second job. Therefore, BLS believed that some of the confusion that occurred during testing would not occur in an actual field interview. Similarly, BLS thought that CWS respondents would understand the difference between main and second jobs if they had been administered the full CPS interview.

Issues identified and final recommendations

Although some participants provided incorrect responses, both types of testing indicated that the four questions generally measured what they were intended to measure. BLS survey methodologists analyzed all participant interviews to determine why incorrect answers had occurred and identified several issues:

In-person and online questions

·       Some participants thought websites that advertised goods and services but did not facilitate payment, such as Craigslist.com, were applicable to both the in-person and online questions.

In-person question

·       In both testing modes, participants who relied on the internet or mobile apps for their work thought the in-person question applied to them. Specifically, participants who found clients through social media and participants who worked for businesses that allow customers to place their orders through mobile apps or websites thought the in-person question applied to their situation. They appeared to miss the reference to “in person.”

·       Many participants in the mTurk testing reported online electronically mediated work (in particular, tasks done through mTurk) as part of their answer to the in-person question.

Online question

·       Several participants with data entry jobs at traditional companies believed that the online question applied to them.

·       Many mTurk participants did not include their experience with mTurk as part of the online question. This may be because they were tested using mTurk and assumed that mTurk tasks should be excluded.

“Which job” questions

·       Participants had some difficulty differentiating between second job and additional work for pay.

The final report on the cognitive testing made several recommendations designed to improve the questions.15 To stress the difference between in-person and online work, the report made two suggestions: (1) to add introductory, clarifying language and (2) to emphasize the words “in person” and “online” in the questions. The report also suggested revising the examples to better represent the type of work being asked about. In addition, the report suggested highlighting that BLS was interested in learning about all work, not just the main job.

The question wording was finalized based on these recommendations. However, because of time and funding constraints, BLS adopted the revised questions without additional cognitive testing.

Final question wording

After making changes based on the cognitive testing results and stakeholder comments, BLS finalized the question wording in July 2016. Before being asked the questions about electronically mediated work, respondents were given a short introduction:

I now have a few questions related to how the internet and mobile apps have led to new types of work arrangements. I will ask first about tasks that are done in person and then about tasks that are done entirely online.

This introduction was intended to alert respondents to the fact that the following questions would touch on the internet and mobile apps. It also aimed to signal respondents to distinguish between in-person work and work done entirely online. The hope was that this introduction would clarify what might otherwise appear to be repetitive language.

Final wording of the in-person question and follow-up “which job” question was as follows:

Q1       Some people find short, IN-PERSON tasks or jobs through companies that connect them directly with customers using a website or mobile app. These companies also coordinate payment for the service through the app or website.

For example, using your own car to drive people from one place to another, delivering something, or doing someone’s household tasks or errands.

Does this describe ANY work (you/NAME) did LAST WEEK?

Q1a    Was that for (your/NAME’s) (job/(main job, (your/NAME’s) second job)) or (other) additional work for pay?

Note that names are used if the question is asked about others in the household. If respondents answer “yes” to the in-person question (Q1), they are asked the follow-up “which job” question (Q1a). People with only one job are asked whether the in-person electronically mediated work was for their job or additional work for pay. Multiple jobholders are asked whether this work was for their main job, a second job, or other additional work for pay.
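
The parenthetical fills in Q1a depend on whether the question refers to the respondent or another household member and on how many jobs the person holds. The sketch below shows how an instrument might assemble the text; the function and variable names are hypothetical and do not reflect the Census Bureau's actual instrument programming.

```python
# Hypothetical sketch of how a data collection instrument might build the
# Q1a "which job" text from the fills shown above. This does not reflect
# the Census Bureau's actual instrument code.

def build_q1a(self_response: bool, multiple_jobholder: bool, name: str = "") -> str:
    your = "your" if self_response else f"{name}'s"
    if multiple_jobholder:
        # Multiple jobholders: main job, second job, or other additional work for pay.
        job_fill = f"{your} main job, {your} second job, or other additional work for pay"
    else:
        # Single jobholders: their job or additional work for pay.
        job_fill = f"{your} job or additional work for pay"
    return f"Was that for {job_fill}?"

print(build_q1a(self_response=True, multiple_jobholder=False))
# -> Was that for your job or additional work for pay?
print(build_q1a(self_response=False, multiple_jobholder=True, name="NAME"))
# -> Was that for NAME's main job, NAME's second job, or other additional work for pay?
```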

As recommended in the cognitive testing report, the words “in person” were capitalized in the question. Interviewers are instructed that capitalized words are important in questions and must be emphasized when conducting interviews. To reduce the underreporting of paid activities that respondents think of as “not a job,” respondents were asked to describe any work they did during the reference period. It should be noted that the basic CPS questions inquiring about work already include the emphasis on any work, so respondents would have heard this emphasis in prior questions. The words “last week” are also emphasized, as they are in other questions throughout the CWS and the CPS that refer to the reference week.

The questions about online electronically mediated work and about which job were very similar to the questions for in-person work, though with emphasis on the word “online” and with different examples:

Q2       Some people select short, ONLINE tasks or projects through companies that maintain lists that are accessed through an app or a website. These tasks are done entirely online, and the companies coordinate payment for the work.

For example, data entry, translating text, web or software development, or graphic design.

Does this describe ANY work (you/NAME) did LAST WEEK?

Q2a   Was that for (your/NAME’s) (job/(main job, (your/NAME’s) second job)) or (other) additional work for pay?

Collecting and processing the data

As mentioned earlier, the software used to program the data collection instrument—that is, the custom-designed software used by Census Bureau interviewers to collect the data—had changed since the CWS was last collected. Because of the change, Census Bureau staff reprogrammed the instrument for the 2017 CWS, adding the four new questions to the end. Staff at both the Census Bureau and BLS performed many rounds of extensive instrument testing to ensure that CWS questions appeared on the screen as expected and that all skip-and-fill patterns were correct. In addition, the Census Bureau tested the processing system before fielding.

It is cost-prohibitive to do in-person training for CPS supplements like the CWS because interviewers are based all over the country. Instead, interviewers are typically trained about supplements through 1-hour self-study materials. BLS updated and augmented the 2005 training materials to include information about the new questions on electronically mediated work. The final May 2017 self-study materials covered not only the new questions but all questions on the CWS, which collects data about a number of different topics. Reflecting the order of the questions in the survey, the information about the new questions appeared at the very end of the self-study. The training materials were provided before the fielding of the CWS, and interviewers were instructed to complete the self-study materials as part of their preparations for the month.

The CWS was fielded in May 2017. No major problems with either the existing questions or the new questions were reported by interviewers during the data collection period. Considerable time was needed to process the data. Just as the data collection instrument had to be reprogrammed, all data edits (the automated procedures used to check and correct responses) had to be completely reprogrammed. In addition, supplement weights needed to be developed.

Evaluating the data: monitoring interviews

Interviews conducted by telephone from one of the Census Bureau’s three data collection centers are taped for quality assurance purposes and are retained for a short period. It is standard practice for BLS staff to monitor a handful of interviews after new CPS questions are fielded. Monitoring allows staff to hear the entire interview, including apparent respondent confusion, requests for clarification, and verbatim responses to the questions. From listening to interviews, it is often possible to determine whether the questions were easily understood by respondents, whether answers were correct, and whether breakdowns in communication occurred.

While the Census Bureau was processing the data, the BLS team monitored many interviews to assess the data quality of the new questions. To enable a qualitative analysis of how the questions worked, the team used the unprocessed data to select cases with a variety of characteristics—such as occupation, self-response versus proxy response, and multiple-jobholding status. The selected cases included both those in which respondents said “yes” to at least one of the new CWS questions and those in which respondents answered “no” to both questions. Three or four team members attended each of several monitoring sessions and recorded their observations. Team members independently noted interactions between respondents and interviewers based on predetermined guidelines and assessed the correctness of answers to the new questions. The group discussed each interview immediately after listening to it, and team members were almost always in complete agreement about their assessments of cases.

In all, the CWS team monitored about 100 interviews. It was clear that there were many false positives to both the in-person and online electronically mediated work questions. Respondents had described the main job earlier in the interview, and they often mentioned additional details about the work when answering the in-person and online questions. For most “yes” responses, it was obvious that the reported work could not have been obtained through a website or app that also coordinated payment for the work. Staff monitoring interviews observed some common patterns.

Many respondents focused on the examples

Many respondents focused on the examples rather than on the definitions of electronically mediated work given in the questions. Additionally, if respondents hesitated, interviewers sometimes repeated only the examples. Consequently, many said “yes” to the question if any of their job duties resembled any of the examples included in the questions. For example, monitors heard the following responses:

·       “Yes, I drive my car to work.”

·       “Yes, I sometimes use a computer at work.”

·       “Yes, that describes part of what I do at work.”

·       “Yes, I’m a graphic designer.”

Also, many respondents who said they did in-person electronically mediated work for their main job also said they did online electronically mediated work for that same job. It is highly unlikely that people did electronically mediated work both in person and entirely online for the same job.16

Many respondents said “yes” if they used websites or mobile apps in their work

Some respondents with traditional jobs used websites or mobile apps in their work. Some of these websites and apps did not facilitate electronically mediated work, but respondents gave affirmative answers to the questions anyway. Many answered “yes” if they obtained clients or jobs using a website or mobile app even if they were not paid through that website or app. Examples of respondents in this type of situation include the following:

·       A real estate agent who obtained customers through the web

·       A gravel delivery person who used an app to obtain route directions

·       A fast-food worker who prepared orders that customers placed through an app

Many respondents said “yes” to the questions if they used a computer for work

Some respondents appeared to think the questions were asking about whether they used a computer in their work. A number of respondents said “yes” to the questions and listed as examples work that was clearly not electronically mediated. Examples of respondents in this type of situation include the following:

·       A university lecturer who did all work online (lectures, student interactions, etc.)

·       A technical support person who was connected to people to help through the internet

·       A receptionist in a doctor’s office who scheduled appointments using a computer

Many interviewers did not seem to understand the goal of the questions

By asking unscripted probes, interviewers can help respondents determine which response option best fits. However, BLS staff rarely observed interviewers probing when necessary to obtain correct answers or providing explanations to confused respondents. Instead, many interviewers simply repeated the examples in the questions. In addition, interviewers sometimes could not interpret the respondents’ answers. In a few cases, interviewers intervened to change previously correct answers, saying things such as “but you do use a computer, don’t you?” for the in-person question. In response to these types of inquiries from interviewers, respondents’ correct “no” answers were occasionally converted to incorrect “yes” answers.

Conclusions from the monitoring

The team observed many false “yes” answers to both the in-person and online questions. In general, both questions appear to have been too complicated. In order for a “yes” answer to be accurate, several conditions needed to be true. Many respondents did not seem to consider all of the necessary conditions and instead responded “yes” when only one of the conditions was true. While the team concluded that both questions had a high number of false positive responses, they observed no false negatives.

Evaluating the data: microdata review

The CWS team then turned to examining records on the confidential microdata file. While this file does not contain as much information about each case as a taped interview, it does include answers for other questions in the CPS and the CWS, including respondents’ verbatim descriptions of job duties, employer name, industry, and occupation. In addition, the file contains information about usual work hours; whether the person worked for the government, a for-profit firm, or a nonprofit firm; and self-employment status. The file also contains CWS information about whether people were independent contractors or in other alternative employment arrangements on their main job.

The electronically mediated work questions were asked about more than 46,000 people, and there were relatively few “yes” responses—about 1,600 for the in-person question, the online question, or both. Most of these answers indicated that the work was done for a person’s main job, and BLS could obtain information about those jobs using the confidential microdata file. A quick review reinforced what had been observed in the monitoring—that many of these “yes” answers were clearly false positives. For example, the file showed that “yes” answers for the in-person question had been recorded for the following main jobs:

·       Vice president of a major bank

·       Manager of a fast-food restaurant

·       Local police officer

·       Surgeon at a large hospital

For the online question, “yes” answers were often given for people who used computers or mobile apps in their work, even though not all of them had done electronically mediated work. Many, though not all, of the people with “yes” answers clearly could not have done all of their work entirely online. Examples of cases with likely false positives for the online question include the following (again, these are people who said they did this work for their main job):

·       Medical assistant administering medication to patients

·       Hair stylist

·       Railroad engineer

·       Front desk clerk at a motel

BLS also examined records with “no” responses for the electronically mediated questions. A quick review reinforced the conclusions from the monitoring—that is, the vast majority of negative answers for both the in-person and online questions appeared to be correct.

Given that both the monitoring and the microdata review revealed that the questions had not worked as intended, BLS considered whether the data were too flawed to release. Although there were many false positives, false negatives did not seem to be a problem. Therefore, the team decided to see whether incorrectly coded cases could be identified using the verbatim information on the confidential microdata file.

The team devised a test to determine whether false positives could be identified, first creating guidelines to help identify whether electronically mediated work had been done. For example, respondents who worked for the federal, state, or local government were unlikely to be electronically mediated workers. The team agreed that unclear cases should be assumed to be correct. (See appendix A for a complete list of the guidelines.)

Using the guidelines the team had developed, a group of five staff members evaluated 100 records with “yes” answers to the in-person question. Key information about each case was read aloud, and each of the five team members independently evaluated whether the respondents’ answers were compatible with electronically mediated work, assigning answers of “yes,” “no,” or “maybe” for each case. Team members’ determinations were not discussed during the evaluation session. After all 100 cases had been evaluated, team members’ responses were compared. For a substantial number of records, the team members had unanimously agreed that the “yes” answer was incorrect. They also agreed unanimously that a few cases definitely had correct answers. The test confirmed to the team that many false positives could be identified through a recoding process.

Recoding the data

Based on the results of the recoding test, the team decided to evaluate all records with affirmative answers to the in-person and online questions and recode erroneous answers when possible. Information is collected for both main and second jobs in the CPS, so any evaluation of answers needed to consider the job for which the electronically mediated work was done. The team devised three approaches for reviewing data that depended on respondents’ answers to the “which job” questions—that is, work done for the main job, a second job, or additional work for pay. The team also reviewed a sample of “no” answers to check for false negatives. 

Electronically mediated work for the main job

The vast majority of respondents who said “yes” to either the in-person or online questions reported that the electronically mediated work had been done for their main job (or their household members’ main job). Of the 912 “yes” answers for the in-person question, 826 (91 percent) were for the main job. Of the 963 “yes” answers for the online question, 917 (95 percent) were for the main job.

The CWS team reviewed each record with a “yes” answer to the in-person or online question and an answer of “main job” to the corresponding “which job” question. The review was done in a systematic fashion by groups of five team members, and the in-person and online questions were evaluated separately. As with the recoding test, key information about each case was read aloud, and each of the five team members independently evaluated whether they thought the respondent had done electronically mediated work, assigning answers of “yes,” “no,” or “maybe” for each case. Team members’ determinations were not discussed during the evaluation sessions.

Cases with four “no” answers and one “maybe” answer were assumed to be false positives, as were cases assigned “no” by all five team members. Once this review was completed, the number of records with “yes” answers to the in-person question for the main job had been reduced from 826 to 184, and the number of “yes” answers to the online question for the main job had been reduced from 917 to 167. Team members believed that, while they had identified many false positives, there were likely additional false positives that could not be identified given the available data.
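
The decision rule used in these main-job reviews can be expressed compactly. The sketch below is illustrative only; the actual review was a manual evaluation of verbatim responses on the confidential microdata file, and the data structures shown are hypothetical.

```python
from collections import Counter

# Illustrative sketch of the main-job recoding rule described above.
# Each record received five independent ratings of "yes", "no", or "maybe".
# A collected "yes" was treated as a false positive (recoded to "no") only if
# all five raters said "no", or if four said "no" and one said "maybe".

def is_false_positive(ratings: list[str]) -> bool:
    counts = Counter(ratings)
    unanimous_no = counts["no"] == 5
    four_no_one_maybe = counts["no"] == 4 and counts["maybe"] == 1
    return unanimous_no or four_no_one_maybe

def recode(collected_answer: str, ratings: list[str]) -> str:
    if collected_answer == "yes" and is_false_positive(ratings):
        return "no"
    return collected_answer          # ambiguous cases keep the collected answer

print(recode("yes", ["no", "no", "no", "no", "maybe"]))   # -> no
print(recode("yes", ["no", "no", "no", "yes", "maybe"]))  # -> yes (kept as collected)
```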

Electronically mediated work for the second job

Almost all respondents who answered “yes” to either the in-person or online questions said that the work was done for the main job. However, a small number of people said this work was done for the second job. Five percent of “yes” responses for the in-person question and 2 percent for the online question were for a second job.

Because of the survey design, the CPS has less information on second jobs than on main jobs. Each month, information about job duties, employer name, occupation, and industry for the second job is collected from only about one-fourth of multiple jobholders.17 For the records with detailed information about second jobs, the team did an evaluation similar to that done for the main job. The team evaluated respondents’ verbatim descriptions and other information to determine whether a “yes” answer to the electronically mediated work questions should have been coded as “no.” For the three-fourths of records for which no additional information was available, the response provided was accepted without recoding.

As with the exercise done for the main job, the team evaluated answers to the in-person and online questions independently and identified a small number of false positives. Because so few records were evaluated, the team was not able to conclude whether “yes” answers were more likely to be correct for the second job than for the main job. After recoding the records for which information was available, the affirmative answers for the in-person question decreased from 50 to 48, and the affirmative answers for the online question decreased from 24 to 19. It is likely that, had information about second jobs been available for the other three-fourths of multiple jobholders, more “yes” answers would have been recoded to “no.”

Electronically mediated work as additional work for pay

A small number of respondents reported that they or their household members did electronically mediated work for “additional work for pay”—4 percent of the “yes” responses for in-person work and 2 percent for online work. The confidential microdata file does not contain any information about what respondents did for additional work for pay. Because the CWS team had no additional information about these respondents, their answers were accepted and were not reviewed. In addition, answers were assumed to be correct for the very small number of respondents who said that they or their household members did electronically mediated work but did not answer the “which job” questions.

False negatives

Although the monitoring suggested there was not a problem with incorrect “no” answers, BLS used the microdata to look for false negatives in two ways. First, staff looked at cases with “no” answers in occupations in which anecdotal evidence suggests there may be high numbers of electronically mediated workers. Staff members saw no evidence of a substantial problem with false negatives in these occupations.

Second, BLS identified records containing selected keywords. Keywords included the names of businesses that commonly facilitate electronically mediated work, such as Uber, Lyft, TaskRabbit, Handy, Amazon mTurk, and Crowdflower. Words associated with electronically mediated work, such as taxi, freelance, and ride share, were also included.18 Using the verbatim descriptions of job duties, employer name, occupation, and industry, a team of five staff members evaluated each case to determine whether an incorrect answer of “no” had been recorded for the in-person and online questions. Team members identified a handful of incorrect “no” answers—9 records out of about 175 cases with the selected keywords. (Many of the correct “no” answers were taxi drivers and handymen who clearly had not done electronically mediated work.) Note that this was not a random sample. Rather, these records were chosen as being the most likely to have false negatives among all those with “no” answers. Because the number of false negatives identified was so small, the team concluded that false negatives were of little concern overall.
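
A keyword screen of this kind can be sketched in a few lines. The sketch below is a hypothetical illustration; the record fields and flagging logic are stand-ins, and the actual review was done manually against the verbatim fields on the confidential microdata file.

```python
import re

# Hypothetical sketch of a keyword screen for potential false negatives.
# The record structure is illustrative, not the actual CWS microdata layout.

KEYWORDS = [
    "uber", "lyft", "taskrabbit", "handy", "mturk", "mechanical turk",
    "crowdflower", "taxi", "freelance", "ride share", "rideshare",
]
PATTERN = re.compile("|".join(re.escape(k) for k in KEYWORDS), re.IGNORECASE)

def flag_for_review(record: dict) -> bool:
    """Flag a 'no' record for manual review if any verbatim field mentions a
    platform or term associated with electronically mediated work."""
    verbatim = " ".join(
        record.get(field, "")
        for field in ("job_duties", "employer_name", "occupation", "industry")
    )
    return record.get("emw_answer") == "no" and bool(PATTERN.search(verbatim))

record = {"emw_answer": "no",
          "job_duties": "Drive passengers using the Uber app",
          "employer_name": "self-employed",
          "occupation": "taxi driver",
          "industry": "transportation"}
print(flag_for_review(record))  # -> True; a reviewer then decides whether the "no" was a false negative
```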

Comparing collected data with recoded data

Final results of the recoding, taking into account both the false positives and the handful of false negatives found by the team, lowered the number of observations with “yes” answers to the in-person electronically mediated work question from 912 to 277. Recoding lowered the number of “yes” answers to the online question from 963 to 208. For both the in-person and online questions, most of the answers that were changed were for workers who had done electronically mediated work for their main job. This was partly because most cases were for the main job, and partly because BLS had information about virtually all main jobs. False positives for cases in which the electronically mediated work was done for the second job were less common, but the BLS team could only evaluate about one-fourth of those cases because of the lack of information about second jobs on the microdata file. Lacking any information about what was done as additional work for pay, the BLS team could not recode any additional-work-for-pay cases.

The recoding of in-person and online electronically mediated work was done independently. There was no attempt to ensure that “yes” answers did not occur for both questions, even though the BLS team agreed that someone was highly unlikely to do electronically mediated work both in person and entirely online for the same job. Despite this, the number of cases with “yes” answers for both in-person and online electronically mediated work was reduced sharply—from 293 in the collected data to 23 in the recoded data.

Weighted estimates showed that the broad demographic characteristics of electronically mediated workers were similar for both the collected and recoded data. However, there were a number of differences by industry. (See table 1.)

Table 1. Impact of BLS data recoding process on estimates of electronically mediated work, percent distribution, May 2017

| Characteristic | Recoded, total | Recoded, in person | Recoded, online | Collected, total | Collected, in person | Collected, online |
| --- | --- | --- | --- | --- | --- | --- |
| Number of workers (in thousands) | 1,609 | 990 | 701 | 5,057 | 3,021 | 2,969 |
| Percent of total employed | 1.0 | 0.6 | 0.5 | 3.3 | 2.0 | 1.9 |
| Class of worker(1) |  |  |  |  |  |  |
| Total | 100.0 | 100.0 | 100.0 | 100.0 | 100.0 | 100.0 |
| Wage and salary workers | 62.8 | 62.4 | 62.6 | 78.1 | 72.7 | 81.9 |
| Private industries | 59.1 | 58.3 | 58.9 | 66.7 | 63.6 | 69.4 |
| Government | 3.8 | 4.2 | 3.8 | 11.4 | 9.2 | 12.5 |
| Self-employed workers | 37.2 | 37.6 | 37.4 | 21.9 | 27.3 | 18.1 |
| Self-employed workers, incorporated | 7.3 | 7.1 | 8.8 | 7.3 | 8.8 | 6.6 |
| Self-employed workers, unincorporated | 29.8 | 30.5 | 28.6 | 14.6 | 18.5 | 11.5 |
| Industry(1) |  |  |  |  |  |  |
| Total | 100.0 | 100.0 | 100.0 | 100.0 | 100.0 | 100.0 |
| Agriculture and related industries | 0.0 | 0.1 | 0.0 | 0.2 | 0.2 | 0.3 |
| Mining, quarrying, and oil and gas extraction | 0.0 | 0.0 | 0.0 | 0.2 | 0.0 | 0.3 |
| Construction | 1.2 | 1.3 | 0.9 | 4.4 | 5.2 | 3.9 |
| Manufacturing | 1.1 | 0.5 | 1.9 | 6.0 | 4.6 | 6.5 |
| Wholesale trade | 0.9 | 0.9 | 0.8 | 2.0 | 2.2 | 2.1 |
| Retail trade | 5.9 | 5.6 | 7.1 | 9.9 | 10.9 | 8.8 |
| Transportation and utilities | 21.8 | 35.0 | 1.9 | 9.8 | 13.9 | 5.1 |
| Information | 4.1 | 1.4 | 7.5 | 3.0 | 1.7 | 4.2 |
| Financial activities | 3.3 | 2.5 | 4.0 | 9.5 | 9.6 | 9.6 |
| Professional and business services | 31.0 | 16.4 | 51.2 | 19.9 | 16.4 | 24.5 |
| Education and health services | 16.3 | 19.2 | 12.9 | 18.4 | 16.9 | 19.8 |
| Leisure and hospitality | 6.4 | 6.5 | 7.1 | 6.3 | 7.1 | 5.0 |
| Other services | 7.2 | 10.0 | 4.2 | 5.9 | 7.5 | 4.8 |
| Public administration | 0.6 | 0.6 | 0.6 | 4.4 | 3.7 | 5.0 |

(1) Refers to the sole or main job; electronically mediated work may be done for the main job, a second job, or additional work for pay.

Notes: Some people did electronically mediated work both in person and online. An Excel version of this table is available at https://www.bls.gov/cps/electronically-mediated-employment.htm.

Source: Contingent Worker Supplement to the Current Population Survey, U.S. Bureau of Labor Statistics.
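
As a reading aid, the "percent of total employed" rows in table 1 are simply the weighted worker counts divided by total employment. The quick check below uses only numbers from the table itself; the implied total employment level of roughly 153 million is an approximation backed out of the table, not a separately published figure.

```python
# Back-of-the-envelope check of the "percent of total employed" rows in table 1.
# Total employment is inferred from the table (collected: 5,057 thousand = 3.3 percent),
# so the ~153 million figure is an approximation, not an official BLS estimate.

collected_total = 5_057      # thousands of workers (collected data)
collected_share = 0.033      # 3.3 percent of total employed
implied_employment = collected_total / collected_share   # roughly 153,000 thousand

recoded_total = 1_609        # thousands of workers (recoded data)
print(round(100 * recoded_total / implied_employment, 1))  # -> 1.0, consistent with the published share
```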

Most notably, 22 percent of electronically mediated workers in the recoded data were in the transportation and utilities industry on their main job, over twice the share found in the collected data (10 percent). Reflecting the relatively large share of electronically mediated workers who were ride-share drivers, the difference was particularly great for electronically mediated workers who did their jobs in person—35 percent as recoded and 14 percent as collected. In addition, the share of electronically mediated workers in professional and business services was higher for the recoded data (31 percent) than for the collected data (20 percent). Many technical jobs in this industry, such as graphic design, copy editing, and computer programming, are sometimes electronically mediated. Among electronically mediated workers who did their work entirely online, the recoded data share was about double that of the collected data—51 percent versus 25 percent.

By class of worker—that is, whether people were wage and salary workers or self-employed—the characteristics are somewhat different for the recoded and collected data. In the recoded data, 4 percent of electronically mediated workers were employed in government on their main job, compared with 11 percent in the collected data. (Although people are unlikely to do electronically mediated work for a government job, some workers employed by the government on their main job did electronically mediated work for a second job or for additional work for pay. In addition, data were not recoded for a small number of cases because there was insufficient verbatim information on the confidential microdata file.) Moreover, the share of electronically mediated workers who were self-employed workers with unincorporated businesses was 30 percent in the recoded data, twice the share in the collected data (15 percent). (Detailed estimates showing the impact of recoding on in-person and online electronically mediated work are available in appendix B.)

For both the collected and recoded data, table 2 shows the numbers and percentages of in-person and online electronically mediated workers who did this work for their main job, a second job, or additional work for pay. The share who did in-person electronically mediated work for their main job was 91 percent in the collected data, compared with 72 percent in the recoded data. The difference was similar for online workers (94 percent in the collected data and 78 percent in the recoded data). This difference largely reflects the fact that BLS had more information about main jobs than about other jobs or additional work for pay, so many main-job cases could be recoded. As mentioned earlier, the confidential file contains information on second jobs for only about one-fourth of multiple jobholders and contains no information about the work done for additional pay. Consequently, BLS could recode very few second-job cases and none of the additional-work-for-pay cases, and, as a result, the collected and recoded estimates differ little for second jobs and not at all for additional work for pay. False positives are therefore more likely to remain in the recoded data among those who did electronically mediated work for a second job or as additional work for pay. While BLS is confident in estimates of the number of people who did electronically mediated work for their main job, the number who did this work for a second job or for additional work for pay may be overstated, and percent distributions from the “which job” questions should be viewed with caution.

Table 2. Impact of BLS data recoding process on “which job” questions for electronically mediated work, in thousands, May 2017
Which job | Recoded: In person | Recoded: Online | Collected: In person | Collected: Online

Total | 990 | 701 | 3,021 | 2,969
Main job | 717 | 544 | 2,746 | 2,799
Second job | 142 | 67 | 143 | 80
Additional work for pay | 120 | 85 | 120 | 85

Percent distribution

Total | 100.0 | 100.0 | 100.0 | 100.0
Main job | 72.4 | 77.6 | 90.9 | 94.3
Second job | 14.3 | 9.5 | 4.7 | 2.7
Additional work for pay | 12.1 | 12.1 | 4.0 | 2.9

Notes: BLS does not recommend using data from the “which job” questions as there was little or no information to recode people who did electronically mediated work for a second job or for additional work for pay. In particular, percent distributions of in-person and online electronically mediated work done for the main job, second job, or additional work for pay are likely to be misleading. Totals include a small number who did not answer the “which job” questions. An Excel version of this table is available at https://www.bls.gov/cps/electronically-mediated-employment.htm.

Source: Contingent Worker Supplement to the Current Population Survey, U.S. Bureau of Labor Statistics.

Conclusions from the recoding

BLS is confident that the recoded data provide a better picture of the number and characteristics of in-person and online electronically mediated workers than do the collected data. While the confidential file provided additional detail about jobs and was used to identify clear false positives, team members did not recode ambiguous cases. In addition, there was little or no information available to recode people who did electronically mediated work for a second job or for additional work for pay, but relatively few responses fell into these categories. Thus, while some false positives likely remain in the recoded data, BLS believes that measures based on the recoded data represent the number and characteristics of electronically mediated workers more accurately than measures based on the collected data.

However, BLS does not recommend using data from the “which job” questions. In particular, percent distributions of in-person and online electronically mediated work done for the main job, second job, or additional work for pay are likely to be misleading.

Characteristics of the employed who did electronically mediated work

All estimates in this section are based on recoded data. BLS believes these data to be superior because they exclude the obvious false positives in the collected data. However, because the questions did not work as intended and there was not enough information to recode all cases, the recoded data may still have limitations. In addition, some of these estimates are based on relatively few observations, so variances may be large.

In May 2017, there were 1.6 million electronically mediated workers, accounting for 1.0 percent of total employment. (See table 3.) These workers obtained short jobs or tasks through websites or mobile apps that both connected them with customers and facilitated payment for the tasks. The estimates include all people who did electronically mediated work, whether for their main job, a second job, or additional work for pay. Of all workers, 0.6 percent did electronically mediated work in person and 0.5 percent did electronically mediated work entirely online. Note that some people did electronically mediated work both in person and entirely online. This can occur when people do electronically mediated work for two different jobs.
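These headline shares follow directly from the published counts in table 3. The short sketch below is illustrative only (the variable names are ours; the counts are the published May 2017 figures in thousands) and shows how the percentages are derived:

```python
# Derive the headline shares from the published May 2017 counts (in thousands).
# Counts are taken from table 3; the results match the published figures after
# rounding to one decimal place.
total_employed = 153_331                              # total employment
em_total, em_in_person, em_online = 1_609, 990, 701   # electronically mediated workers

def share(count: int, base: int) -> float:
    """Percent of base, rounded to one decimal place."""
    return round(100 * count / base, 1)

print(share(em_total, total_employed))      # 1.0 percent of total employment
print(share(em_in_person, total_employed))  # 0.6 percent did the work in person
print(share(em_online, total_employed))     # 0.5 percent did the work entirely online
```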

Table 3. Electronically mediated workers by selected characteristics, in thousands, May 2017
Characteristic | Total employed | Electronically mediated workers: Total | In person | Online | Percent of total employed: Total | In person | Online

Total, 16 years and over | 153,331 | 1,609 | 990 | 701 | 1.0 | 0.6 | 0.5
Men | 81,545 | 870 | 534 | 370 | 1.1 | 0.7 | 0.5
Women | 71,785 | 739 | 456 | 331 | 1.0 | 0.6 | 0.5

Age

16 to 24 | 19,054 | 166 | 73 | 110 | 0.9 | 0.4 | 0.6
25 to 54 | 98,801 | 1,146 | 718 | 488 | 1.2 | 0.7 | 0.5
  25 to 34 | 33,991 | 401 | 239 | 184 | 1.2 | 0.7 | 0.5
  35 to 44 | 32,065 | 355 | 223 | 146 | 1.1 | 0.7 | 0.5
  45 to 54 | 32,745 | 390 | 257 | 157 | 1.2 | 0.8 | 0.5
55 and over | 35,476 | 297 | 199 | 104 | 0.8 | 0.6 | 0.3
  55 to 64 | 26,236 | 219 | 150 | 75 | 0.8 | 0.6 | 0.3
  65 and over | 9,240 | 77 | 49 | 29 | 0.8 | 0.5 | 0.3

Race and Hispanic or Latino ethnicity

White | 120,638 | 1,200 | 692 | 589 | 1.0 | 0.6 | 0.5
Black or African American | 18,588 | 276 | 228 | 48 | 1.5 | 1.2 | 0.3
Asian | 9,110 | 93 | 45 | 49 | 1.0 | 0.5 | 0.5
Hispanic or Latino ethnicity | 25,525 | 265 | 183 | 94 | 1.0 | 0.7 | 0.4

Usual full- and part-time status(1)

Full-time workers | 125,240 | 1,165 | 687 | 548 | 0.9 | 0.5 | 0.4
Part-time workers | 28,091 | 444 | 303 | 154 | 1.6 | 1.1 | 0.5

Class of worker(2)

Wage and salary workers | 138,183 | 1,011 | 618 | 439 | 0.7 | 0.4 | 0.3
  Private industries | 116,300 | 950 | 577 | 413 | 0.8 | 0.5 | 0.4
  Government | 21,884 | 61 | 41 | 27 | 0.3 | 0.2 | 0.1
Self-employed workers | 15,147 | 598 | 372 | 262 | 3.9 | 2.5 | 1.7
  Self-employed workers, incorporated | 5,575 | 118 | 70 | 61 | 2.1 | 1.3 | 1.1
  Self-employed workers, unincorporated | 9,572 | 480 | 302 | 201 | 5.0 | 3.2 | 2.1

Industry(2)

Agriculture and related industries | 2,498 | 1 | 1 | 0 | 0.0 | 0.0 | 0.0
Mining, quarrying, and oil and gas extraction | 775 | 0 | 0 | 0 | 0.0 | 0.0 | 0.0
Construction | 10,484 | 20 | 13 | 6 | 0.2 | 0.1 | 0.1
Manufacturing | 15,984 | 18 | 5 | 13 | 0.1 | 0.0 | 0.1
Wholesale trade | 3,383 | 15 | 9 | 6 | 0.4 | 0.3 | 0.2
Retail trade | 16,131 | 96 | 56 | 50 | 0.6 | 0.3 | 0.3
Transportation and utilities | 7,773 | 351 | 346 | 13 | 4.5 | 4.5 | 0.2
Information | 2,894 | 66 | 14 | 53 | 2.3 | 0.5 | 1.8
Financial activities | 10,640 | 52 | 25 | 28 | 0.5 | 0.2 | 0.3
Professional and business services | 18,528 | 499 | 162 | 359 | 2.7 | 0.9 | 1.9
Education and health services | 35,384 | 262 | 190 | 90 | 0.7 | 0.5 | 0.3
Leisure and hospitality | 14,244 | 104 | 64 | 50 | 0.7 | 0.5 | 0.3
Other services | 7,517 | 115 | 99 | 30 | 1.5 | 1.3 | 0.4
Public administration | 7,095 | 10 | 6 | 4 | 0.1 | 0.1 | 0.1

Occupation(2)

Management, professional, and related occupations | 62,378 | 720 | 261 | 505 | 1.2 | 0.4 | 0.8
  Management, business, and financial operations occupations | 25,866 | 234 | 117 | 130 | 0.9 | 0.5 | 0.5
  Professional and related occupations | 36,513 | 486 | 144 | 375 | 1.3 | 0.4 | 1.0
Service occupations | 26,405 | 264 | 245 | 35 | 1.0 | 0.9 | 0.1
Sales and office occupations | 32,584 | 235 | 121 | 128 | 0.7 | 0.4 | 0.4
  Sales and related occupations | 15,134 | 109 | 57 | 62 | 0.7 | 0.4 | 0.4
  Office and administrative support occupations | 17,450 | 125 | 64 | 67 | 0.7 | 0.4 | 0.4
Natural resources, construction, and maintenance occupations | 14,104 | 39 | 20 | 19 | 0.3 | 0.1 | 0.1
  Farming, fishing, and forestry occupations | 1,222 | 4 | 0 | 4 | 0.4 | 0.0 | 0.4
  Construction and extraction occupations | 7,985 | 21 | 7 | 14 | 0.3 | 0.1 | 0.2
  Installation, maintenance, and repair occupations | 4,896 | 14 | 14 | 0 | 0.3 | 0.3 | 0.0
Production, transportation, and material moving occupations | 17,860 | 352 | 343 | 15 | 2.0 | 1.9 | 0.1
  Production occupations | 8,785 | 7 | 4 | 3 | 0.1 | 0.0 | 0.0
  Transportation and material moving occupations | 9,075 | 345 | 339 | 12 | 3.8 | 3.7 | 0.1

Contingent worker status(2)

Contingent workers, estimate 1 | 1,958 | 21 | 13 | 8 | 1.1 | 0.7 | 0.4
Contingent workers, estimate 2 | 2,511 | 79 | 62 | 22 | 3.1 | 2.5 | 0.9
Contingent workers, estimate 3 | 5,858 | 126 | 86 | 45 | 2.2 | 1.5 | 0.8
Noncontingent workers | 147,473 | 1,483 | 904 | 656 | 1.0 | 0.6 | 0.4

Alternative employment arrangement(2)

Independent contractors | 10,614 | 597 | 375 | 264 | 5.6 | 3.5 | 2.5
On-call workers | 2,579 | 68 | 47 | 21 | 2.6 | 1.8 | 0.8
Temporary help agency workers | 1,356 | 46 | 35 | 11 | 3.4 | 2.6 | 0.8
Workers provided by contract firms | 933 | 23 | 18 | 5 | 2.4 | 1.9 | 0.5
Workers with traditional arrangements | 137,853 | 882 | 521 | 401 | 0.6 | 0.4 | 0.3

Educational attainment

Total, 25 years and over | 134,277 | 1,443 | 917 | 592 | 1.1 | 0.7 | 0.4
Less than a high school diploma | 9,578 | 64 | 53 | 12 | 0.7 | 0.6 | 0.1
High school graduates, no college | 33,616 | 284 | 230 | 58 | 0.8 | 0.7 | 0.2
Some college or associate degree | 36,088 | 375 | 265 | 126 | 1.0 | 0.7 | 0.3
Bachelor’s degree and higher | 54,994 | 720 | 370 | 396 | 1.3 | 0.7 | 0.7
  Bachelor’s degree only | 33,749 | 403 | 196 | 229 | 1.2 | 0.6 | 0.7
  Advanced degree | 21,246 | 317 | 173 | 167 | 1.5 | 0.8 | 0.8
Notes:

(1)Based on usual hours at all jobs combined. Full time is 35 hours or more per week; part time is less than 35 hours.

(2)This refers to the sole or main job; electronically mediated work may be done for the main job, a second job, or other additional work for pay.

Notes: These are estimates of electronically mediated workers as recoded by BLS. Some people did electronically mediated work both in person and online. Estimates for the race groups (White, Black or African American, and Asian) do not sum to totals because data are not presented for all races. People whose ethnicity is identified as Hispanic or Latino may be of any race. An Excel version of this table is available at https://www.bls.gov/cps/electronically-mediated-employment.htm.

Source: Contingent Worker Supplement to the Current Population Survey, U.S. Bureau of Labor Statistics.

Electronically mediated workers were slightly more likely to be men than women, reflecting the fact that, overall, a higher percentage of the employed were men. (See table 4.) Compared with workers overall, electronically mediated workers were more likely to be in the prime-working-age category (25 to 54) and less likely to be in the oldest age category (55 and over). They were also more likely than workers overall to work part time.19

Table 4. Percent distribution of total employed and electronically mediated workers, by selected characteristics, May 2017

Characteristic | Total employed | Electronically mediated workers: Total | In person | Online

Total, 16 years and over (in thousands) | 153,331 | 1,609 | 990 | 701
Percent of total | 100.0 | 100.0 | 100.0 | 100.0
Men | 53.2 | 54.1 | 53.9 | 52.7
Women | 46.8 | 45.9 | 46.1 | 47.3

Age

16 to 24 | 12.4 | 10.3 | 7.4 | 15.6
25 to 54 | 64.4 | 71.2 | 72.6 | 69.5
  25 to 34 | 22.2 | 24.9 | 24.1 | 26.3
  35 to 44 | 20.9 | 22.1 | 22.5 | 20.9
  45 to 54 | 21.4 | 24.3 | 25.9 | 22.4
55 and over | 23.1 | 18.4 | 20.1 | 14.9
  55 to 64 | 17.1 | 13.6 | 15.1 | 10.8
  65 and over | 6.0 | 4.8 | 4.9 | 4.1

Race and Hispanic or Latino ethnicity

White | 78.7 | 74.6 | 69.9 | 84.0
Black or African American | 12.1 | 17.1 | 23.0 | 6.9
Asian | 5.9 | 5.8 | 4.6 | 7.0
Hispanic or Latino ethnicity | 16.6 | 16.4 | 18.5 | 13.4

Usual full- and part-time status(1)

Full-time workers | 81.7 | 72.4 | 69.4 | 78.1
Part-time workers | 18.3 | 27.6 | 30.6 | 21.9

Class of worker(2)

Wage and salary workers | 90.1 | 62.8 | 62.4 | 62.6
  Private industries | 75.8 | 59.1 | 58.3 | 58.9
  Government | 14.3 | 3.8 | 4.2 | 3.8
Self-employed workers | 9.9 | 37.2 | 37.6 | 37.4
  Self-employed workers, incorporated | 3.6 | 7.3 | 7.1 | 8.8
  Self-employed workers, unincorporated | 6.2 | 29.8 | 30.5 | 28.6

Industry(2)

Agriculture and related industries | 1.6 | 0.0 | 0.1 | 0.0
Mining, quarrying, and oil and gas extraction | 0.5 | 0.0 | 0.0 | 0.0
Construction | 6.8 | 1.2 | 1.3 | 0.9
Manufacturing | 10.4 | 1.1 | 0.5 | 1.9
Wholesale trade | 2.2 | 0.9 | 0.9 | 0.8
Retail trade | 10.5 | 5.9 | 5.6 | 7.1
Transportation and utilities | 5.1 | 21.8 | 35.0 | 1.9
Information | 1.9 | 4.1 | 1.4 | 7.5
Financial activities | 6.9 | 3.3 | 2.5 | 4.0
Professional and business services | 12.1 | 31.0 | 16.4 | 51.2
Education and health services | 23.1 | 16.3 | 19.2 | 12.9
Leisure and hospitality | 9.3 | 6.4 | 6.5 | 7.1
Other services | 4.9 | 7.2 | 10.0 | 4.2
Public administration | 4.6 | 0.6 | 0.6 | 0.6

Occupation(2)

Management, professional, and related occupations | 40.7 | 44.7 | 26.4 | 71.9
  Management, business, and financial operations occupations | 16.9 | 14.5 | 11.8 | 18.5
  Professional and related occupations | 23.8 | 30.2 | 14.5 | 53.5
Service occupations | 17.2 | 16.4 | 24.8 | 5.0
Sales and office occupations | 21.3 | 14.6 | 12.2 | 18.3
  Sales and related occupations | 9.9 | 6.8 | 5.8 | 8.8
  Office and administrative support occupations | 11.4 | 7.8 | 6.4 | 9.5
Natural resources, construction, and maintenance occupations | 9.2 | 2.4 | 2.1 | 2.7
  Farming, fishing, and forestry occupations | 0.8 | 0.3 | 0.0 | 0.6
  Construction and extraction occupations | 5.2 | 1.3 | 0.7 | 2.1
  Installation, maintenance, and repair occupations | 3.2 | 0.9 | 1.4 | 0.0
Production, transportation, and material moving occupations | 11.6 | 21.9 | 34.6 | 2.1
  Production occupations | 5.7 | 0.4 | 0.4 | 0.4
  Transportation and material moving occupations | 5.9 | 21.5 | 34.2 | 1.7

Contingent worker status(2)

Contingent workers, estimate 1 | 1.3 | 1.3 | 1.3 | 1.2
Contingent workers, estimate 2 | 1.6 | 4.9 | 6.2 | 3.1
Contingent workers, estimate 3 | 3.8 | 7.8 | 8.7 | 6.5
Noncontingent workers | 96.2 | 92.2 | 91.3 | 93.5

Alternative employment arrangement(2)

Independent contractors | 6.9 | 37.1 | 37.8 | 37.7
On-call workers | 1.7 | 4.2 | 4.8 | 3.0
Temporary help agency workers | 0.9 | 2.8 | 3.5 | 1.5
Workers provided by contract firms | 0.6 | 1.4 | 1.8 | 0.6
Workers with traditional arrangements | 89.9 | 54.8 | 52.6 | 57.2

Educational attainment

Total, 25 years and over | 100.0 | 100.0 | 100.0 | 100.0
Less than a high school diploma | 7.1 | 4.5 | 5.7 | 2.0
High school graduates, no college | 25.0 | 19.7 | 25.1 | 9.8
Some college or associate degree | 26.9 | 26.0 | 28.9 | 21.3
Bachelor’s degree and higher | 41.0 | 49.9 | 40.3 | 67.0
  Bachelor’s degree only | 25.1 | 27.9 | 21.4 | 38.7
  Advanced degree | 15.8 | 22.0 | 18.9 | 28.3
Notes:

(1)Based on usual hours at all jobs combined. Full time is 35 hours or more per week; part time is less than 35 hours.                                                                            

(2)This refers to the sole or main job; electronically mediated work may be done for the main job, a second job, or other additional work for pay.

Notes: These are estimates of electronically mediated workers as recoded by BLS. Some people did electronically mediated work both in person and online. Estimates for the race groups (White, Black or African American, and Asian) do not sum to totals because data are not presented for all races. People whose ethnicity is identified as Hispanic or Latino may be of any race. An Excel version of this table is available at https://www.bls.gov/cps/electronically-mediated-employment.htm.

Source: Contingent Worker Supplement to the Current Population Survey, U.S. Bureau of Labor Statistics.

Blacks or African Americans accounted for 17 percent of electronically mediated workers, higher than their share of overall employment (12 percent). By contrast, Whites made up 75 percent of electronically mediated workers, slightly lower than their share of workers overall (79 percent). Hispanics or Latinos made up 16 percent of electronically mediated workers, and Asians accounted for 6 percent. Blacks were overrepresented among in-person electronically mediated workers (23 percent), while Whites were overrepresented among online workers (84 percent).

Educational attainment data are restricted to those age 25 and over because most people have completed their education by that age. Compared with workers overall, people who did electronically mediated work were more likely to have a bachelor’s degree or higher. This was driven by people who did their tasks entirely online; 67 percent of online electronically mediated workers had a bachelor’s degree or higher.

Self-employed workers were more likely than wage and salary workers to do electronically mediated work (4 percent versus 1 percent). (See table 3.) Five percent of self-employed workers whose businesses were unincorporated did such work, as did 2 percent of the self-employed with incorporated businesses.

By industry, workers in transportation and utilities (on their main job) were the most likely to have done electronically mediated work; 5 percent of workers in this industry did such work. Workers in professional and business services, information, and other services were also more likely than workers overall to do electronically mediated work, at 3 percent, 2 percent, and 2 percent, respectively. Workers in transportation and utilities and in other services were more likely to do the work in person, while those in professional and business services and in information were more likely to do it online.

Workers in the four alternative employment arrangements measured in the CWS—independent contractors, temporary help agency workers, on-call workers, and workers provided by contract firms—were more likely than workers in traditional arrangements to have done electronically mediated work. Independent contractors were the most likely to do electronically mediated work—6 percent did so in May 2017, compared with 3 percent of temporary help agency workers, 3 percent of on-call workers, and 2 percent of workers provided by contract firms. By contrast, less than 1 percent of workers in traditional arrangements were electronically mediated workers.

Lessons learned

BLS should not again attempt to collect data about electronically mediated work using the four new questions fielded in the May 2017 CWS. If BLS were to collect data about electronically mediated work in the future, the questions would need to be substantially revised. It may simply be that the concepts are too complicated for four questions to capture all the information BLS was attempting to measure.

BLS recognizes that a number of steps could be taken to improve future questions on any subject, not just those concerning electronically mediated work. These steps address question development, cognitive testing, and interviewer training.

Cognitively test participants who are not in the population of interest

For complicated questions that are intended to measure a small portion of the population, it is important to test participants who do not fit the characteristic of interest as well as those who do. For the laboratory testing, BLS made the strategic decision to recruit people who had done in-person electronically mediated work or who were in occupations with a large number of electronically mediated workers. These laboratory participants may have been more familiar with the concepts than people in occupations with little or no electronically mediated work. For example, even if they do no electronically mediated work themselves, many taxi drivers may be familiar with such work because they know about Uber and Lyft drivers. They may be able to answer the questions more easily than people in occupations with few electronically mediated workers, such as schoolteachers or firefighters. For the mTurk testing, all participants were at least somewhat familiar with electronically mediated work because they were using the mTurk platform. Thus, mTurk participants were almost certainly more familiar with electronically mediated work than many other workers would have been. A larger sample with a wider variety of occupations and work arrangements might have provided more insight into the potential for false positives.

Conduct additional rounds of cognitive testing

Conducting multiple rounds of cognitive testing is a recommended best practice, particularly when questions involve complicated concepts or revisions are made during testing. Because of time and funding constraints, BLS committed to a tight timeframe for cognitive testing and conducted only one round of testing. Improvements may have been realized if—after analyzing the cognitive interview results and revising the questions—additional interviews had been conducted to evaluate the revisions.

Improve interviewer training

Interviewers clearly could have benefited from additional training. Because CPS interviewers are located all over the country, in-person classroom training would be prohibitively expensive. Less expensive ways to improve training include the following:

·       Additional time for the self-study material

·       Computer-based training modules with graded quizzes

·       Online training sessions

·       Web-based training or teleconferences

·       Practice interviews

Increase interviewer involvement in questionnaire and training design

CPS interviewers have a keen sense of what might be confusing to other interviewers and to respondents. Meeting with interviewers before finalizing the questionnaire to identify possible problems with questions could improve both the questionnaire and the training material. It could also have the added benefit of increasing interviewer engagement.

Continue to learn from research

Electronically mediated work is an area that is being increasingly studied. Since the 2017 CWS was fielded, new findings have emerged that could have been helpful in the question design process. For example, Statistics Finland, when attempting to measure “platform jobs,” found that they needed to include company names in the question in order to get accurate responses.20 BLS should keep abreast of new research, continue to work with outside experts, and leverage the efforts of others when designing questions.

Appendix A. Guidelines for recoding

Each record was reviewed by multiple staff members. To determine whether responses were correct, reviewers looked at the following pieces of information:

·       Verbatim descriptions of industry, occupation, and duties on the job

·       Company name

·       Class of worker (whether wage and salary worker or self-employed)

·       Whether the person is an independent contractor, an on-call worker, a temporary help agency worker, or a worker provided by a contract firm on the main job

·       The number of hours a person usually works

Answers were not recoded if there was not enough information to determine, with a high probability, that a person was not doing electronically mediated work.

Table A-1. Guidelines for assessing whether workers do electronically mediated work

In-person electronically mediated work

Less likely to have done electronically mediated work if the person works:
·       In a management occupation
·       As a real estate agent
·       In sales
·       In manufacturing or mining
·       In an occupation that requires extensive infrastructure to provide the service
·       For the federal, state, or local government

More likely to have done electronically mediated work if the person:
·       Is a driver or delivery person
·       Works in home healthcare
·       Does chores or other short-term work
·       Works in an occupation where customers typically need a worker only for a short or fixed period
·       Usually works few hours per week

Less likely if the person:
·       Does NOT work for a business that connects people with clients through a website or mobile app, OR
·       Is NOT paid by or through a business’s website or mobile app that connects people with clients or customers, OR
·       Is NOT doing in-person work

More likely if the person:
·       Works for a business that connects people with clients or customers through a website or mobile app, AND
·       Is paid by or through the business that owns the website or mobile app, AND
·       Is doing in-person work

Electronically mediated work that is done entirely online

Less likely to have done electronically mediated work if the person works:
·       In a management occupation
·       As a real estate agent
·       In sales
·       In manufacturing or mining
·       As a driver or delivery person
·       For the federal, state, or local government

More likely to have done electronically mediated work if the person:
·       Does data entry, answers surveys, or assesses internet sites
·       Does copyediting, translating, or graphic design
·       Does data analysis or programming
·       Does digital marketing or social media analysis
·       Does online tutoring or course development
·       Works in an occupation that requires no face-to-face interaction
·       Usually works few hours per week

Less likely if the person:
·       Does NOT work for a business that connects people with clients through a website or mobile app, OR
·       Is NOT paid by or through a business’s website or mobile app that connects people with clients or customers, OR
·       Is NOT doing work entirely online

More likely if the person:
·       Works for a business that connects people with clients or customers through a website or mobile app, AND
·       Is paid by or through the business that owns the website or mobile app, AND
·       Is doing work entirely online
Source: U.S. Bureau of Labor Statistics.
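To make the decision rule in these guidelines concrete, the following is a minimal sketch, not BLS production code, of how the three conditions for in-person work could be combined; the field names are hypothetical. Consistent with the recoding rules described above, a collected “yes” is changed only when a condition clearly fails, and ambiguous cases are left as collected.

```python
# Illustrative sketch of the table A-1 decision rule for in-person work
# (not BLS production code; the field names are hypothetical). A collected
# "yes" is recoded to "no" only when at least one required condition clearly
# fails; if the information is incomplete, the answer is left as collected.
from typing import Optional

def recode_in_person_yes(connects_via_app: Optional[bool],
                         paid_through_platform: Optional[bool],
                         work_done_in_person: Optional[bool]) -> str:
    """Return 'yes', 'no', or 'keep as collected' for a collected 'yes' answer."""
    conditions = (connects_via_app, paid_through_platform, work_done_in_person)
    if any(c is False for c in conditions):
        return "no"                    # a clear false positive
    if all(c is True for c in conditions):
        return "yes"                   # consistent with electronically mediated work
    return "keep as collected"         # insufficient information to recode

# A ride-share driver paid through the app keeps the "yes."
print(recode_in_person_yes(True, True, True))    # yes
# A real estate agent paid directly by clients is recoded to "no."
print(recode_in_person_yes(False, None, True))   # no
# A vague verbatim description is left unchanged.
print(recode_in_person_yes(None, None, True))    # keep as collected
```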

Appendix B. Detailed information on the effect of the BLS data recoding process

Table B-1. Electronically mediated work as collected and as recoded, by selected characteristics, in thousands, May 2017
Characteristic | Total employed | Recoded: Total | In person | Online | Collected: Total | In person | Online

Total, 16 years and over | 153,331 | 1,609 | 990 | 701 | 5,057 | 3,021 | 2,969
Men | 81,545 | 870 | 534 | 370 | 2,650 | 1,647 | 1,500
Women | 71,785 | 739 | 456 | 331 | 2,407 | 1,374 | 1,469

Age

16 to 24 | 19,054 | 166 | 73 | 110 | 471 | 259 | 306
25 to 54 | 98,801 | 1,146 | 718 | 488 | 3,489 | 2,078 | 2,042
  25 to 34 | 33,991 | 401 | 239 | 184 | 1,170 | 700 | 738
  35 to 44 | 32,065 | 355 | 223 | 146 | 1,207 | 696 | 687
  45 to 54 | 32,745 | 390 | 257 | 157 | 1,112 | 682 | 617
55 and over | 35,476 | 297 | 199 | 104 | 1,098 | 683 | 622
  55 to 64 | 26,236 | 219 | 150 | 75 | 808 | 481 | 459
  65 and over | 9,240 | 77 | 49 | 29 | 290 | 202 | 163

Race and Hispanic or Latino ethnicity

White | 120,638 | 1,200 | 692 | 589 | 3,983 | 2,354 | 2,398
Black or African American | 18,588 | 276 | 228 | 48 | 664 | 406 | 365
Asian | 9,110 | 93 | 45 | 49 | 279 | 180 | 135
Hispanic or Latino ethnicity | 25,525 | 265 | 183 | 94 | 821 | 520 | 457

Usual full- and part-time status(1)

Full-time workers | 125,240 | 1,165 | 687 | 548 | 4,139 | 2,351 | 2,559
Part-time workers | 28,091 | 444 | 303 | 154 | 918 | 670 | 410

Class of worker(2)

Wage and salary workers | 138,183 | 1,011 | 618 | 439 | 3,948 | 2,196 | 2,432
  Private industries | 116,300 | 950 | 577 | 413 | 3,374 | 1,920 | 2,061
  Government | 21,884 | 61 | 41 | 27 | 574 | 277 | 371
Self-employed workers | 15,147 | 598 | 372 | 262 | 1,109 | 824 | 538
  Self-employed workers, incorporated | 5,575 | 118 | 70 | 61 | 370 | 265 | 196
  Self-employed workers, unincorporated | 9,572 | 480 | 302 | 201 | 739 | 560 | 341

Industry(2)

Agriculture and related industries | 2,498 | 1 | 1 | 0 | 13 | 5 | 10
Mining, quarrying, and oil and gas extraction | 775 | 0 | 0 | 0 | 10 | 1 | 10
Construction | 10,484 | 20 | 13 | 6 | 223 | 156 | 116
Manufacturing | 15,984 | 18 | 5 | 13 | 302 | 140 | 193
Wholesale trade | 3,383 | 15 | 9 | 6 | 102 | 66 | 62
Retail trade | 16,131 | 96 | 56 | 50 | 501 | 330 | 260
Transportation and utilities | 7,773 | 351 | 346 | 13 | 495 | 420 | 152
Information | 2,894 | 66 | 14 | 53 | 150 | 52 | 126
Financial activities | 10,640 | 52 | 25 | 28 | 483 | 291 | 285
Professional and business services | 18,528 | 499 | 162 | 359 | 1,006 | 495 | 727
Education and health services | 35,384 | 262 | 190 | 90 | 932 | 510 | 589
Leisure and hospitality | 14,244 | 104 | 64 | 50 | 317 | 215 | 150
Other services | 7,517 | 115 | 99 | 30 | 299 | 227 | 142
Public administration | 7,095 | 10 | 6 | 4 | 224 | 112 | 149

Occupation(2)

Management, professional, and related occupations | 62,378 | 720 | 261 | 505 | 2,499 | 1,238 | 1,711
  Management, business, and financial operations occupations | 25,866 | 234 | 117 | 130 | 1,207 | 667 | 761
  Professional and related occupations | 36,513 | 486 | 144 | 375 | 1,292 | 572 | 950
Service occupations | 26,405 | 264 | 245 | 35 | 569 | 474 | 221
Sales and office occupations | 32,584 | 235 | 121 | 128 | 1,220 | 672 | 776
  Sales and related occupations | 15,134 | 109 | 57 | 62 | 589 | 387 | 333
  Office and administrative support occupations | 17,450 | 125 | 64 | 67 | 631 | 285 | 443
Natural resources, construction, and maintenance occupations | 14,104 | 39 | 20 | 19 | 225 | 154 | 122
  Farming, fishing, and forestry occupations | 1,222 | 4 | 0 | 4 | 8 | 1 | 8
  Construction and extraction occupations | 7,985 | 21 | 7 | 14 | 118 | 77 | 75
  Installation, maintenance, and repair occupations | 4,896 | 14 | 14 | 0 | 99 | 76 | 39
Production, transportation, and material moving occupations | 17,860 | 352 | 343 | 15 | 543 | 483 | 139
  Production occupations | 8,785 | 7 | 4 | 3 | 77 | 58 | 32
  Transportation and material moving occupations | 9,075 | 345 | 339 | 12 | 466 | 425 | 107

Contingent worker status(2)

Contingent workers, estimate 1 | 1,958 | 21 | 13 | 8 | 85 | 47 | 59
Contingent workers, estimate 2 | 2,511 | 79 | 62 | 22 | 158 | 100 | 95
Contingent workers, estimate 3 | 5,858 | 126 | 86 | 45 | 285 | 167 | 180
Noncontingent workers | 147,473 | 1,483 | 904 | 656 | 4,772 | 2,854 | 2,790

Alternative employment arrangement(2)

Independent contractors | 10,614 | 597 | 375 | 264 | 980 | 750 | 489
On-call workers | 2,579 | 68 | 47 | 21 | 175 | 113 | 88
Temporary help agency workers | 1,356 | 46 | 35 | 11 | 63 | 40 | 43
Workers provided by contract firms | 933 | 23 | 18 | 5 | 55 | 38 | 20
Workers with traditional arrangements | 137,853 | 882 | 521 | 401 | 3,798 | 2,093 | 2,329

Educational attainment

Total, 25 years and over | 134,277 | 1,443 | 917 | 592 | 4,586 | 2,761 | 2,663
Less than a high school diploma | 9,578 | 64 | 53 | 12 | 213 | 172 | 78
High school graduates, no college | 33,616 | 284 | 230 | 58 | 837 | 575 | 407
Some college or associate degree | 36,088 | 375 | 265 | 126 | 1,282 | 741 | 753
Bachelor’s degree and higher | 54,994 | 720 | 370 | 396 | 2,255 | 1,274 | 1,424
  Bachelor’s degree only | 33,749 | 403 | 196 | 229 | 1,387 | 782 | 908
  Advanced degree | 21,246 | 317 | 173 | 167 | 868 | 492 | 516
Notes:

(1)Based on usual hours at all jobs combined. Full time is 35 hours or more per week; part time is less than 35 hours.                                                                     

(2)This refers to the sole or main job; electronically mediated work may be done for the main job, a second job, or other additional work for pay.

Notes: Some people did electronically mediated work both in person and online. Estimates for the race groups (White, Black or African American, and Asian) do not sum to totals because data are not presented for all races. People whose ethnicity is identified as Hispanic or Latino may be of any race. An Excel version of this table is available at https://www.bls.gov/cps/electronically-mediated-employment.htm.

Source: Contingent Worker Supplement to the Current Population Survey, U.S. Bureau of Labor Statistics.

Suggested citation:

Current Population Survey staff, "Electronically mediated work: new questions in the Contingent Worker Supplement," Monthly Labor Review, U.S. Bureau of Labor Statistics, September 2018, https://doi.org/10.21916/mlr.2018.24

Notes


1 The Current Population Survey (CPS) is jointly sponsored by U.S. Bureau of Labor Statistics (BLS) and the U.S. Census Bureau and is best known for the national unemployment rate. Statistics from the CPS, widely used by policymakers and researchers, are among the country’s most timely economic indicators. The CPS provides extensive information about the employment and unemployment status of the population, and the survey data can be broken out by a variety of demographic characteristics, such as race, ethnicity, educational attainment, disability status, age, and gender. In addition, the CPS is a primary source of socioeconomic data about the labor force, including industry, occupation, hours of work, and earnings. In most months of the year, the monthly CPS questions are followed by supplementary questions about a particular topic. The Contingent Worker Supplement (CWS) is one such supplement. The Department of Labor’s Chief Evaluation Office sponsored the May 2017 CWS.

2 In order to limit respondent burden, surveys are often designed so that respondents are asked questions based on how they answered earlier questions. In the CPS, for example, people who said they have a job are asked a series of questions about their employment, questions that are not asked of people without jobs. The ways respondents are routed through the survey questions are referred to as “skip patterns.” Also, the wording of particular questions is often conditional on information obtained earlier in the survey. For instance, a question about a particular household member may include a “fill” of the household member’s name.
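As a simplified illustration only (the routing and question wording below are stand-ins, not the actual CPS instrument), a skip pattern with a name fill might look like this:

```python
# Simplified illustration of a skip pattern with a name "fill"
# (the wording and routing are stand-ins, not the actual CPS instrument).
def interview(member_name: str, worked_last_week: bool) -> list:
    """Return the questions a respondent would be asked about one household member."""
    questions = [f"LAST WEEK, did {member_name} do any work for either pay or profit?"]
    if worked_last_week:
        # Follow-up employment questions are asked only when the answer above is
        # "yes"; respondents who answer "no" skip them entirely.
        questions.append(f"How many hours per week does {member_name} usually work?")
        questions.append(f"What kind of work does {member_name} do?")
    return questions

print(interview("Pat", worked_last_week=True))
print(interview("Pat", worked_last_week=False))
```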

3 In addition to electronically mediated workers and platform workers, other terms used to describe people who do this type of work include the following: online gig workers, workers providing services through digital matching firms, e-lancers, sharing economy workers, on-demand economy workers, digitally matched workers, peer-to-peer economy workers, collaborative economy workers, and electronically intermediated workers.

4 For example, see Devin Fidler, “Work, interrupted: the new labor economics of platforms,” Institute for the Future, 2016, http://www.iftf.org/fileadmin/user_upload/downloads/wfi/IFTF_Work-Interrupted_FullReport.pdf; Jonathan V. Hall and Alan B. Krueger, “An analysis of the labor market for Uber’s driver-partners in the United States,” Princeton University, Industrial Relations Section, Working Paper 587, January 2015, https://dataspace.princeton.edu/jspui/handle/88435/dsp010z708z67d; Jane Dokko, Megan Mumford, and Diane Whitmore Schanzenbach, “Workers and the online gig economy,” The Hamilton Project, December 2015, http://www.hamiltonproject.org/papers/workers_and_the_online_gig_economy; Sarah A. Donovan, David H. Bradley, and Jon O. Shimabukuro, “What does the gig economy mean for workers?” Congressional Research Service, R44365, February 5, 2016, https://fas.org/sgp/crs/misc/R44365.pdf; Seth D. Harris and Alan B. Krueger, “A proposal for modernizing labor laws for twenty-first-century work: the ‘independent worker,’” Discussion Paper 2015–10, The Hamilton Project, December 2015, http://www.hamiltonproject.org/assets/files/modernizing_labor_laws_for_twenty_first_century_work_krueger_harris.pdf; Sara Horowitz and Fabio Rosati, “53 million Americans are freelancing, new survey finds,” Freelancers Union, September 4, 2014, https://blog.freelancersunion.org/2014/09/04/53million/; Rudy Telles, Jr., “Digital matching firms: a new definition in the ‘sharing economy’ space,” ESA Issue Brief no. 01-16 (Economics and Statistics Administration, June 3, 2016); and Diana Farrell and Fiona Greig, “Paychecks, paydays, and the online platform economy: big data on income volatility,” J.P. Morgan Chase Institute, February 2016, https://institute.jpmorganchase.com/content/dam/jpmc/jpmorgan-chase-and-co/institute/pdf/jpmc-institute-volatility-2-report.pdf.

5 See Hall and Krueger, “An analysis of the labor market for Uber’s driver-partners in the United States”; and Lawrence Mishel, “Uber and the labor market: Uber drivers’ compensation, wages, and the scale of Uber and the gig economy,” Economic Policy Institute, May 15, 2018, https://www.epi.org/publication/uber-and-the-labor-market-uber-drivers-compensation-wages-and-the-scale-of-uber-and-the-gig-economy/.

6 The CWS is asked of employed people who are not unpaid family workers on their main job. The survey also includes a few respondents who were not employed. These respondents are asked about their last job.

7 For example, see Katharine G. Abraham and Ashley Amaya, “Probing for informal work activity,” National Bureau of Economic Research, Working Paper 24880, August 2018, http://www.nber.org/papers/w24880; and Anat Bracha and Mary A. Burke, “Who counts as employed? Informal work, employment status, and labor market slack,” Federal Reserve Bank of Boston, Working Paper 16-29, December 2016, https://www.bostonfed.org/publications/research-department-working-paper/2016/who-counts-as-employed-informal-work-employment-status-and-labor-market-slack.aspx.

8 Respondents who have indicated that someone in the household has a farm or business are asked “LAST WEEK, did you do ANY work for either pay or profit?”

9 In order to investigate possible measurement error in the classification of labor force status in the CPS and other surveys using labor force questions similar to those in the CPS, BLS researchers investigated data on income-generating activities from the American Time Use Survey (ATUS), data that are not available in the CPS. The evaluation of the ATUS data indicated that, while there may be some misclassification of workers in surveys that use questions similar to the CPS labor force questions, the effect on the total employment estimate is likely small. See Mary Dorinda Allard and Anne E. Polivka, "Measuring labor market activity today: are the words work and job too limiting for surveys?," Monthly Labor Review, U.S. Bureau of Labor Statistics, November 2018, https://doi.org/10.21916/mlr.2018.26.

10 For example, see Ajay Agrawal, John Horton, Nicola Lacetera, and Elizabeth Lyons, “Digitization and the contract labor market: a research agenda,” in Avi Goldfarb, Shane M. Greenstein, and Catherine E. Tucker, eds., Economic Analysis of the Digital Economy (Chicago, IL: University of Chicago Press, 2015), pp. 219–50; Valerio De Stefano, “The rise of the ‘just-in-time workforce’: on-demand work, crowdwork and labor protection in the ‘gig economy,’” Comparative Labor Law and Policy Journal, vol. 37, no. 3, June 2016, pp. 471–504; Alek Felstiner, “Working the crowd: employment and labor law in the crowdsourcing industry,” Berkeley Journal of Employment and Labor Law, vol. 32, no. 1, 2011, pp. 143–203; and Gordon Burtch, Seth Carnahan, and Brad N. Greenwood, “Can you gig it? An empirical examination of the gig-economy and entrepreneurial activity,” Ross School of Business Paper no. 1308, March 2016, University of Michigan, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2744352.

11 For example, see Miriam A. Cherry, “A taxonomy of virtual work,” Georgia Law Review, vol. 45, no. 4, summer 2011, p. 968, https://heinonline.org/HOL/Page?collection=journals&handle=hein.journals/geolr45&id=1022; Karën Fort, Gilles Adda, and K. Bretonnel Cohen, “Amazon Mechanical Turk: gold mine or coal mine?” Computational Linguistics, vol. 37, no. 2, June 2011, pp. 413–20; David Martin, Benjamin V. Hanrahan, Jacki O’Neill, and Neha Gupta, “Being a turker,” in Proceedings of the 17th ACM Conference on Computer Supported Cooperative Work and Social Computing, pp. 224–35, 2014; and Kotaro Hara, Abigail Adams, Kristy Milland, Saiph Savage, Chris Callison-Burch, and Jeffrey P. Bigham, “A data-driven analysis of workers’ earnings on Amazon Mechanical Turk,” in Proceedings of the 2018 CHI Conference on Human Factors in Computer Systems, 2018.  

12 An extensive list of companies involved in electronically mediated work that existed when the questions were being developed can be found in Rudy Telles, Jr., “Digital matching firms.”

13 For Office of Management and Budget standards and guidelines for statistical surveys, see https://obamawhitehouse.archives.gov/sites/default/files/omb/inforeg/statpolicy/standards_stat_surveys.pdf.

14 For Office of Management and Budget standards and guidelines for cognitive interviews, see https://obamawhitehouse.archives.gov/sites/default/files/omb/inforeg/directive2/final_addendum_to_stat_policy_dir_2.pdf.

15 For the final cognitive testing report, see appendix H of the Office of Management and Budget clearance for the 2017 CWS.

16 The data collection instrument allowed “yes” answers for both questions.

17 Households selected for the CPS are in the sample for 8 months total over a 16-month period. Detailed information about multiple jobs is collected only in the 4th and 8th months in which households are in the sample.

18 The full list of keywords follows: Clickworker, Click Worker, Crowdflower, Crowd Flower, Crowdsource, Crowd Source, Favor, Fiverr, freelance, Grubhub, Grub Hub, Handy, Instacart, Insta Cart, Lyft, Mechanical Turk, Microworkers, Micro Workers, Minijob, Mini Job, Onespace, One Space, Postmates, Post Mates, Rapidworker,  rideshare, ride share, ridesharing, ride sharing, Shorttask, Short Task, Taskrabbit, Task Rabbit, taxi, Turk, Uber, Upwork, Washeo, Washio, and .com. The search was not case sensitive.
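As an illustration only (a minimal sketch, not the actual BLS review code), a case-insensitive search for these keywords in a verbatim response could be written as follows:

```python
# Minimal sketch of a case-insensitive keyword search over verbatim text
# (illustrative only, not the actual BLS review code); the list below is a
# subset of the keywords given in this note.
KEYWORDS = ["clickworker", "crowdflower", "fiverr", "freelance", "grubhub",
            "instacart", "lyft", "mechanical turk", "postmates", "rideshare",
            "taskrabbit", "taxi", "uber", "upwork", ".com"]

def mentions_keyword(verbatim: str) -> bool:
    """Return True if the verbatim text contains any keyword, ignoring case."""
    text = verbatim.lower()
    return any(keyword in text for keyword in KEYWORDS)

print(mentions_keyword("Drives for Uber on weekends"))  # True
print(mentions_keyword("Elementary school teacher"))    # False
```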

19 Part-time workers are defined as those who usually work less than 35 hours per week at all jobs combined.

20 Hanna Sutela, “Platform jobs are here to stay—how to measure them?” Statistics Finland, April 17, 2018, http://www.stat.fi/tietotrendit/blogit/2018/platform-jobs-are-here-to-stay-how-to-measure-them/.
