Last spring, a photo made the rounds in nanny and parenting Facebook groups. Its photographer and context unknown, it depicted a toddler in a stroller who had allegedly been left unattended by a nanny outside a coffee shop. As the photo circulated among a group of New York City nannies at a meetup I attended, one question kept coming up: What was the nanny’s race? If she was black and Caribbean, like most of the women at the meetup, they all stood to have their reputations tarnished. Racial stereotypes, one of the nannies said, were “like a knife,” hidden but ready to be brandished against them at the first local news incident or social media–fueled rumor.
Everyone shook their heads in disapproval of the neglectful nanny’s conduct. But they also empathized with the experience of being spied on and publicly shamed online. They all knew that if the photographed nanny were identified, her career would be jeopardized, possibly even over. That’s because online spaces have enabled new kinds of employer surveillance.
In-home care workers are used to being watched. Surveillance of domestic employees has been a common practice, whether it involves posting surreptitious photos of nannies online or installing “nanny cams” at home (legal in all 50 states, with or without consent). As this surveilling impulse has thrived online, it has also influenced the hiring process. Digital first impressions have become a stand-in for older, more involved ways of brokering relationships between care workers and employers. Traditionally, agencies and referrals from past employers played a more significant role in this process. In our research, we found that a growing number of nannies, babysitters, home health aides, and house cleaners are instead using online marketplaces like Care.com and similar sites to find work.
These marketplaces can feel a lot like online dating sites. They encourage employers to rely on gut instinct when browsing prospective hires, who are prompted to upload profile pictures, post videos of themselves to highlight bubbly, warm, or fun personalities, write up biographical narratives, and link to their social media accounts. Employers can filter search results by ZIP code or other criteria, such as nonsmokers, and initiate contact with individual care workers via private message, or vice versa. And like dating apps, the sites seem to offer a near-endless array of options to choose from.
A hiring guideline on Care.com, one of the largest marketplaces for hiring domestic work, advises that while background-check companies avoid social media screening in order to comply with employment discrimination law, doing the online snooping on prospective hires yourself is fair game and even encouraged: “Many people have no privacy settings on their personal pages and treat them like a tell-all. Look for objectionable photos and status updates. Pay careful attention to the person’s Facebook friends and those who comment on their page.” Even a job candidate’s Pinterest board, the guide adds, might hold further clues to her character and fitness to care for your child.
In other employment contexts, this type of scrutiny might draw backlash for being invasive or potentially discriminatory. But the hiring norms for domestic work have always been different, in large part due to the intimacy of the work. Entrusting someone with the care of one’s child or elderly parent demands an immense degree of trust and can be an anxiety-inducing process. Moreover, few formal pathways for skills credentialing—such as certificates or degrees that stand as evidence of one’s qualifications—exist for this line of work, so employers rely on more intangible qualities, like personality.
Online marketplaces have attempted to assuage these anxieties by trying to scale and fast-track the process of building trust between employers and care workers. They do this in part by implementing rating systems, increasingly common in industries from ride-hailing to restaurants, as one of several ways that workers can be assessed by past employers. As my colleagues wrote in 2016, ratings give these judgments an air of objectivity, even as they may reproduce the biases of consumers. In our research, we found that employers sometimes leave bad ratings on these platforms as a form of retaliation against workers for understandable actions such as turning down a job offer. Like the nanny photos and rumors that informally circulate on social media, bad ratings can damage a care worker’s reputation and access to work but come with little context for potential employers to assess their fairness.
Most domestic workers are well aware of the prevailing hierarchies in the industry. The National Domestic Workers Alliance reports that black and Latina domestic workers across the U.S. are paid less than their white counterparts for the same work. Other research has shown that a care worker’s race and/or immigration status inform the expectations placed on her by employers and limit access to some jobs. For instance, an immigrant woman of color may be more likely to be expected by employers to do double duty as a house cleaner for no additional pay. Given that workers are judged by different, stereotype-driven rubrics, the reputation systems built into online marketplaces are likewise going to be uneven playing fields.
Compounding the problem is that these biases can be hard to recognize and prove. I interviewed a West African immigrant in her 50s who was struggling to find work on Care.com’s app in between long hours working as a housekeeper at a major hotel chain. Before her hotel job, she had been a seasoned in-home care worker, with more than a decade of experience as a nanny. But the kids she cared for grew up, and she had to find new work. The hotel work was hard on her health, the commute from the Bronx was long, and the hours left her too exhausted at the end of each day to complete the homework for the English language courses she was taking.
While she applied to dozens of jobs each day on her somewhat battered but still functional smartphone, she was, at that point, unable to secure even an interview. She suspected her Care.com profile was to blame: It was only partially filled out, because she had found many of the sections confusing. While she had done her best to narrate her qualifications in broken English, she struggled to replicate the résumé-like prose that online bios seem to demand. And ultimately, what did her ability to self-brand have to do with her ability to care for people’s children?
Online marketplaces may not be the root cause of individual employers’ biases, but their design is not neutral. They are built around a particular archetype of the “entrepreneurial” domestic worker—one who feels at home in the world of apps, social media, and online self-branding—and they ultimately replicate, and can even exacerbate, many of the divisions that came with our predigital workplaces. As platform companies gain growing power over the hiring processes of a whole industry, they will need to actively work against the embedded inequalities in the markets they now mediate.
This kind of technology-enabled scrutiny by employers—from stroller spying to Facebook snooping—can leave care workers feeling hypervisible. But in a context where a Care.com search for a “child care provider” within a 10-mile radius of a Manhattan ZIP code turns up more than 34,000 care worker profiles, the other risk is not being seen at all.