Identity thieves and other criminals seek to appropriate children’s identities. The web of “big data” analytics isn’t interested in taking identities but in doing things with them.1 Big data seeks to use children’s data to affect kids’ activities, opportunities, and trajectories, while also furthering the goals of the data user itself. The identity thief takes a Social Security number, date of birth, and mailing address to obtain a credit card in the child’s name. Big data isn’t a thief. The activities in this realm are usually lawful or at least close to the legal line. But the big data ecosystem can operate in a sketchy space. It can silently take pieces of information from children and their adults, mine them for more information, and reshare that information with an unknown number of others for unspecified ends.
At this point, what is your intuition? Do the bad actors—the identity thieves and others—seem risky enough to lead you to think differently about sharenting? Or are you inclined to see the threat they pose as more avalanche (terrifying but rare) than snowstorm (dangerous but manageable)? How does your risk assessment change, if at all, if you think more about the snowstorm scenario than the avalanche? If you live in an area with winter weather, at some point, you probably will drive in a snowstorm. And if you engage in sharenting, private information about your children will go through the big data blizzard. Time to hit the brakes?
Let’s say that the identity thief is like the stereotypical burglar who breaks into your house and takes your stuff: you’re left without your possessions and harmed by this loss. The big data thief could be seen as more akin to a customer at a yard sale who buys the old bureau you inherited from your grandmother that you think is worthless, discovers a treasure trove of family photos and other heirlooms inside, and keeps the stash for herself.
That’s a helpful but incomplete analogy. Let’s look at where it works. Big data isn’t stealing. You’re welcoming it. You might be rolling out the welcome mat to big data because you don’t realize it’s there. You might know it’s there but think it’s helping you. And perhaps it is helping you or at least not hurting you.
Let’s look at where this analogy breaks down. In your interactions with digital tech and associated big data, you are typically deriving immediate benefits that go beyond the removal of an unwanted possession. To make the analogy more accurate, big data might be like the yard sale customer who gives you a new dresser for free and takes away your old one. The analogy also breaks down because big data doesn’t typically deprive you of the use of any of your own information that you generate yourself: it just uses it for its own purposes. To make the analogy even more accurate, our yard sale customer might leave you a duplicate set of everything she found in the dresser and then use the set she took for her own purposes. The analogy also breaks down because, in big data world, you are continually creating new data by engaging the digital tech. It’s not a finite set of valuables that you leave in the form of a data trail but an ever-growing set. And it’s an ever-growing set that wouldn’t exist but for the digital tech that you are using.
Now we’re at the point with our analogy where the yard sale customer takes away your old dresser and gives you a new one. For as long as you have it, that new dresser continues to give you new benefits, like a sock matcher so you never lose any socks again. What’s so bad about your magic wardrobe? You’re starting to think you might actually find Narnia3 after all these years! Well, you might. But it might also be that, instead of a witch waiting for you on the other side of the wardrobe, the wardrobe itself is bewitched. It starts learning a lot of things about you and your family that you don’t even realize it is learning.
Let’s say you’re using the magic wardrobe to house your daughter’s clothes.4 The wardrobe is perfectly matching her socks, but you don’t notice that it is making a copy of each sock. The wardrobe also selects your daughter’s outfit for each day and coordinates the socks with the outfit. How does the wardrobe know how to produce an outfit that is perfect for the day’s events? You gave the wardrobe permission, when it arrived, to communicate with your iPhone calendar via an embedded sensor in the back of the wardrobe. The sensor system is also linked to sensors in your daughter’s clothes, so the smart wardrobe combines what it learns from your iPhone calendar to tell you what your daughter should wear.
Forget the Lion and the Witch: it’s like Mary Poppins has taken up residence in this wardrobe! You’re loving this helpful magic so much that you don’t think about what else the magic wardrobe is learning about your daughter. You don’t ask if it’s figuring out how she’s doing in school from your calendar entry that reads “Parent-teacher conference re: bullying issue @ 2 p.m.” You don’t think about whether it’s figuring out how fast she’s growing from reading her clothing tags. You don’t wonder if it’s keeping its discoveries to itself. Your daughter looks awesome, and you have five to ten minutes more each morning to Instagram her #girlpower pics.
What you’re actually doing with your children’s data in real life is a lot like this magical wardrobe. In exchange for free or inexpensive access to efficient, engaging, interactive digital services and products, you are sharing an ever-expanding amount of your children’s personal information with those tech providers. You likely don’t realize how much data you are sharing or how that tech provider can use your children’s information and allow an indeterminate number of unidentifiable third parties to use it too. We don’t need make-believe to find ourselves in a veritable Fantasia of spying objects.
Let’s move from make-believe enchanted objects to the real-life enchanted objects and other forms of digital tech you’re likely using today. Facebook can add your post about toilet training dilemmas as a data point to its own information about you, as well as whatever information it is sharing with third parties. Barbie, Elmo, new nanny: it’s all data. The question isn’t “Who might be interested in this kind of dossier on kids?” but “Who wouldn’t be?”
Is this stuff happening already? Yes, it is. We are only beginning to understand the methods and the scope. The rapid pace of tech innovation, the lack of transparency in many major data-related markets, and other factors combine to keep us, as security expert Bruce Schneier tells it, the David to the Goliath of big data.5
Here’s what we do know. Federal and state laws impose almost no limits on the ability of parents to share information about their kids online.6 As soon as private individuals, companies, or nonprofits receive this information from parents, there are few legal limits on what they can do with it.
Those limits that do exist come from general bodies of law or laws that apply to the receiving people or entities, not specific statutory and regulatory schemes that address parents’ legal rights to divulge their children’s private information. Some significant limits include those from criminal law. Parents can’t steal their children’s identities, manufacture child porn, or commit other crimes against or involving their children. Consumer law and contract law require companies to follow their own terms of service and policies, best-practice commitments, and other commitments they make regarding how they will use children’s data.
A federal statute, the Children’s Online Privacy Protection Act, does limit what many private companies can do to collect and use information directly from children under age thirteen. The limit? The covered companies need to have a parent’s permission before collecting and using the data.7 Similar legal limits exist for teachers: they need to obtain parental consent before sharing students’ private data, unless an exception applies.8
There are government actors and institutions outside of education where parental consent is not dispositive. For instance, a juvenile court may be legally barred from sharing information about a child’s court case even with parental permission.
But we now find ourselves back more or less where we started: parents stand at the center of a largely consent-based framework for the digital distribution and use of their children’s private data. After they have consented to digital data sharing about their kids, either by doing it themselves or allowing other adults and institutions to do it, the data can travel at warp speed across entities and time.9
Data brokers facilitate this movement by aggregating and analyzing digital data. Brokers then sell this data to third parties. Data buyers then use discrete data points or larger data sets to engage in data-driven decision making for their own purposes.10 Private companies that collect, store, and share relevant data with individuals and institutions that are willing to pay are not a new idea. Holdovers from last century include the consumer credit bureaus, real estate brokers, and employment headhunters.
We are now at a phase of data broker development where, in the words of a former secretary of defense, Donald Rumsfeld, there seem to be more “known unknowns” than there are “known knowns.” We know that data brokers serve a range of constituencies. Brokers are loosely regulated, and those regulations that do exist are largely industry-specific. For instance, data brokers who are functioning as credit reporting agencies are bound by regulations on credit-based decision making.11 Some brokers are collecting information about children.12
Data brokers are typically “opt out” rather than “opt in,” and the process for removing your information from a brokerage firm is variable, difficult, and possibly of limited utility. We don’t know exactly how many data brokers there are. We don’t know how all of them gather their data, with whom they’re sharing it, and for what purposes. We can’t easily find out if and how we could dispute data points inside their black boxes. Data integrity is a problem, but so is data safety. We can’t easily read the news to see if our or our children’s data has been caught up in a data breach of a given data broker because we don’t know which brokers have our data.
But we do know that children’s data is a hot commodity for data brokers.13 We have a decent sense of some key markets where data brokers’ offerings are, may be, or are likely to come into use. These include credit, insurance, education, and employment.14 We don’t know what stealth, emerging, or future markets may exist for children’s data. For instance, as more data about ever-younger children, even at the preconception and gestation stages, is gathered and analyzed, might there be a market for “life” insurance for the embryo or fetus before it is born? Today, parents can buy life insurance for their children. If an insurance broker can aggregate data from a fertility app and other sources, it could offer an insurance product for expectant parents that would cover medical and other costs, as well as the emotional and psychological consequences of a lost pregnancy.
Fetal life insurance doesn’t seem to be a real thing. Yet. But it isn’t the stuff of pure speculation. Target was able to determine which of its customers were pregnant and advertise accordingly, which outed a pregnant teen to her parents.15 Target does this type of customer data analytics based on data from both its brick-and-mortar stores and its digital engagement.16 Who knew that buying cotton balls meant there was a bun in the oven?
In the digital commerce sphere, there are companies whose mission is offering health and wellness services specifically for reproductive functions. For example, HelloFlo “offer[s] one-of-a-kind care packages to help women and girls through transitional times in their life. As well, we have content that will educate, inspire[,] and entertain you.”17 The company began as a subscription service that focused on delivering tampons and pads through a “reminder service that also deliver[s] the right products at the right time.”18 Its initial advertisement for a “Period Starter Kit,” aimed at preteen and teen girls, was an “ever-so-slightly subversive viral hit.”19
HelloFlo assures its users that it will protect their privacy. It follows up with broad language around product and service development that is typical of digital companies: “We may also draw upon [your] Personal Information in order to adapt the Services of our community to your needs, to research the effectiveness of our network and Services, and to develop new tools for the community.”20 With “Personal Information” that presumably contains data about a user’s menstrual cycle and other health matters, there appears to be fertile ground for conceiving new commercial tools related to intimate life.
So some parts of the private market, from big-box stores to digitally based health and wellness companies, are already gathering and, in some instances, mining reproductive health data to make sales. It is reasonable to expect that private markets will develop new products and services based on this data. The pitch for prenatal life insurance practically writes itself: “A prenatal vitamin a day protects baby on its way. For one dollar more, ensure your heart won’t be sore if the stork misses your door.” Don Draper might cringe at the lyrics, but he’d admire the entrepreneurial spirit.
Back from the plausible to the present: many of the individual and institutional decision makers who make key choices about kids’ futures already use or are likely to start using some form of digital data–driven decision making to inform their decisions. The scope and types of data tools vary. Some may purchase profiles from data brokers, while others run their own in-house or even ad hoc operations. But the general takeaway is unambiguous: the gatekeepers to services and opportunities that are likely to matter most to young people’s futures are using digital data trails to decide whether gates open or stay barred.
On the education front, we know that colleges look at the social media profiles of applicants. They certainly look at applicants’ educational records. And the “use of predictive analytics [in college admissions] generated from big data sources such as social media postings, test scores, and demographic data faces few legal limits. No law prohibits colleges from gathering information about students from social media or other publicly available information.”21 Employers seem to be increasingly “integrating the screening of social media profiles [of job applicants] in Applicant Tracking Systems.”22 And “there are currently no restrictions in place to protect against discrimination on the basis of one’s personal [social] network.”23
So are schools and employers looking at what parents say publicly about their kids? Most likely. Do we know how educational and professional decision makers will react to such information? No.24 Your kids could be flagged as better or worse candidates depending on the profiles you have created for them on public-facing websites, like blogs. The potential impact of your social media content is less clear if that content is not publicly available. But even if kids' information drawn from their parents' private social media posts isn't being brokered today, a change in social media companies' policies could make it so tomorrow.
We know that some insurance companies already use smart technologies and other predictive analytics to help calculate risk and premiums. This practice is likely to expand as kids who have been using “enchanted” items like the Owlet bootie from birth come of age and apply for more insurance products.25 We know that the consumer credit industry is looking at ways to score you based on your social media engagement.26 Your kids might not yet be applying for credit cards, but when they do, will what you said about them on social media be part of the card issuers’ decision-making process?
We are also seeing evidence that actors in government and politics increasingly rely on digital data in their activities. These activities carry potentially far more serious consequences than getting into a certain college. They strike at the heart of young people’s ability to participate in democracy and enjoy the protections of democratically created and maintained civil rights and liberties: “A society that permits the unchecked ascendancy of surveillance infrastructures cannot hope to remain a liberal democracy.”27 As kids come of age and vote for the first time, what digital content will be served up to them, and how will it be determined? We know that there is a growing role for personalized, microtargeted content that is related to democratic participation.28 Imagine the precision of the microtargeting that could be done on Tommy S. and his cohort: kids conceived during a full moon respond favorably to ads showing furry people or furry friendly monsters, preferably red.
We don’t need to look into the future to find an interplay between kids’ personal digital data and the public sphere. Governmental actors’ use of digital data also has consequences for kids during their childhood. We know that governmental actors use various forms of surveillance, including facial-recognition software, and predictive analytical tools to engage in digital monitoring and policing.29 We don’t know if kids and teens are exempt from or subject to special protections in this realm.30
We know that schools and law enforcement agencies are relying increasingly on digital tools to effectuate and track discipline. A misbehaving student isn't just sent to the principal's office; instead, his data may be sent to the court system to follow him as he pursues future opportunities.31
We know that law enforcement monitors social media and uses data from it in their policing.32 Information you share could implicate your kids in wrongdoing. It could also stigmatize them or otherwise make them vulnerable if you are engaged in lawful yet potentially unpopular activities.
There is an argument to be made that sometimes parental inclusion of kids in social media accounts of unsafe situations may play a protective function for the kids that outweighs any privacy harms. Parents who have live streamed hostage situations, for instance, might help law enforcement monitor and try to protect the well-being of the children involved.33 Although parental social media use might help provide a window into some tragic situations that require governmental intervention, it also appears that social media use might play a role in setting up a portion of those situations. If the hostage taker is a mentally ill parent, did the lure of momentary social media fame contribute to her decision to commit those acts against her children?34
We also know that law enforcement officers are using social media in new and surprising ways that can result in making your kids' private data public. Take the sheriff in Ohio who posted pictures on Facebook of a woman and man passed out from heroin overdoses in the front seats of their car. The woman's young grandson was in the backseat of the car. The grandson was also in the picture. The sheriff didn't deidentify him. When asked to explain his decision, the sheriff cited the importance of raising public awareness about the tragedies of opioid addiction. He reasoned that no one would remember who the toddler was anyway.35 Unfortunately, the national discussion over whether his action properly balanced the child's privacy with public safety has likely ensured that many more people will remember.
And we know that law enforcement and the broader justice system are increasingly relying on digital data collection and analytics to inform many high-stakes activities. These data-driven decisions may include where to establish a police presence, how long to incarcerate a convicted criminal defendant, and what terms to require if that defendant goes out on bail.36 Tech company employees have started signing pledges not to build certain types of law enforcement databases, like a Muslim registry.37 These types of statements reflect the growing reality of data-driven governmental action. But such declarations may be largely symbolic.
Governmental actors are unlikely to need to build new databases or programs to gain access to information that allows people to determine religious affiliation or features of identity that are sometimes related to religion, such as ethnicity. For example, school districts keep detailed records on kids and families. Under the current presidential administration, the US Immigration and Customs Enforcement (ICE) agency has increased immigration enforcement near schools. It’s not a stretch to think that ICE could ask schools to share digital data with its agents so that ICE could attempt to mine the data for information related to immigration status.38
Third-party tech providers also handle student and family data from schools because they provide services for schools. It’s also not a stretch to think that ICE could ask these third parties for such data.
How a particular school or vendor would respond to these requests is uncertain one way or the other. Either way, as a parent, you would have little if any visibility into such transactions. Big Brother today is part of an extended family network that often makes you digital data offers you can't refuse—because you never receive the offer in the first place.
We’ve reviewed how digital information about children might expose them to criminal or hostile activities, as well as how this information might be used by third-party decision makers to assess children’s merits and access to opportunities. Let’s think now about a more old-fashioned type of risk—interference with or harm to kids’ interpersonal connections and personal lives. People of all ages that kids meet now or in the future will go online and learn things about them.
This category shares some of the same concerns as the third-party risk category. It looks at the implications of lawful (or not clearly unlawful) use of kids’ digital data trails to make judgments about them. But it is distinct in its focus on how this data can affect nondigital interpersonal interactions that kids have in their youth or adulthood. In turn, these interactions impact children’s and teens’ reputations (both now and going forward) and their sense of self (kids’ understanding of who they are and will become). We’re done talking about Big Brother in the metaphorical sense. Now we’re talking about brothers, sisters, and all other types of humans in the actual sense.
Reputation comprises the narratives and expectations that others have of you. It's about the narrative they see and hear, not the one you understand yourself. This perceived narrative is a key component of your relationships with others. Some aspects of it might be grounded in truth, and other parts might be extrapolations, assumptions, or even errors. Reputation is something you can cultivate or not, but either way, you will have one. It reaches people you've never met and may never meet. It also reaches people whom you actually know.
The key here is that it is an interpersonal story, not an arm’s-length, robotic, number-crunched transaction. We are moving from “What will the data analytics program used by the college admissions officer at a teen’s dream school make of the social media profile his parents built for him?” to “Congrats! He’s admitted into his dream school! What will his freshman roommate say when he sees the same profile?”
Let’s put ourselves in the shoes of this first-year college student. Maybe the admissions officer thought it was adorable that you used to dress up as Peter Pan and stage impromptu musical performances. But your new roommate is a jerk and makes merciless fun of your five-year-old self, and he has the video footage to prove it. Your parents’ Facebook privacy status is set to “friends of friends,” and his cousin’s girlfriend’s aunt’s dog-trainer’s assistant is your mom’s BFF. Even in the predigital age, jerks found ammunition for mocking people, but in that age, private, playful, childhood moments stayed protected in the past. Back then, your roommate would have been able to mock you only for things you did at school, such as puking in the stairwell or other collegiate missteps: “Baaaaaad night omg #pukeface #nofilter” with a Snap attached. That makes him an a#$^%^, although he might still look like a saint compared to what some eighth-graders did with the Peter Pan video when you were in the sixth grade. You were too embarrassed to tell your mom what happened, so she never took down the offending footage. Actually, your classmates’ parents were more vicious than your classmates were.39
Your peers aren’t the only ones who have access to the internet. The adults you encounter also make their way to Google, even if they get there typing with their thumbs on BlackBerries. So instead of having the opportunity to introduce your new boss to the young adult version of yourself, your presentation of your present-day self is filtered through digitally available information about your past selves.
For instance, the human resources (HR) department at your first postcollege job likely will tell your boss not to Google you. HR doesn’t want your boss to uncover sensitive information (like a blog post in which your parents referred to the times in your tween years when you questioned your sexuality) that you could later allege informed your boss’s attitude toward you. Your boss probably ran the search anyway. Humans are curious mammals. And Google is a resource that is impossible for our mammal selves to resist.40
Even in the predigital era, your boss would have been curious about you. But learning about your adolescent angst would have been difficult unless she knew your teenage self or knew someone else who had known you. And if she had known you or someone who had, she would have more robust context for understanding the information about your sexuality struggles. She likely would have some degree of affection or at least good will toward you from having watched you grow up or knowing someone who did. She likely would feel bound by more long-established norms around interpersonal interactions with children in brick-and-mortar communities.41
What do behavioral norms dictate when an adult who didn’t know your childhood self encounters this self for the first time in its digital record form? It’s complicated. That adult is looking at information about you that is out of date, not presented by you, and not contextualized. You will need to navigate the extent and impact of her knowledge as you get to know her.
Here, there is no clear set of norms. Should you assume that everyone will Google you before meeting you? Should you assume that everyone can find your parents’ Facebook profiles through social networks? What about Twitter? Instagram? Parenting blogs? You don’t know what they’ve seen or haven’t seen. You don’t even know how to bring up that question because although using the internet to learn about someone is widespread, it would strike an awkward note if your conversational gambit were something like, “You probably know that I wet the bed until age ten because of my mom’s blog post about it.”42
Instead, you are left with a free-floating sense of unease. You know that your reputation will be shaped significantly by the digital data that exists about you. You can’t know what all this data is or who has access to how much of it and when. You have no reliable, meaningful way to have input into how these data points are integrated into the stories that other people develop about you. Frustrating, isn’t it?
Now let’s go back to being our adult selves. We can’t handle the truth of the teen brain. But we do need to keep looking at the range of reputation-related acts in which we as adults engage. Perhaps more insidious than the “meeting a new adult” category are the ways in which parents and other trusted adults can use social media to mediate our children’s relationships with their other loved ones.
Think of parents who are locked in a vicious custody dispute.43 A mom might Instagram a picture of a Valentine from her six-year-old daughter with wobbly block letters that says “You’re my #1 parent.” But what if her former husband, her daughter’s father, sees it and is angry at both mom and possibly daughter? This mom may have told her daughter to write it to support her case for sole custody. But the daughter is put in the position of having to explain it to dad during one weekend visitation, even though she has no idea how dad saw the card. And if dad tends to have too many beers while watching the football game on Sunday, then he might not listen to her. He might see her as playing for the opposing team, and she might come to see herself as unworthy of being on his team, a negative dynamic that inflicts both immediate and long-term damage.
For the writer Pam Houston, our lives are shaped by the stories we tell ourselves about ourselves, but the stories we tell also can “put walls around our lives.”44 The stories that other people tell about us can do the same. It’s not only children who are affected by storytelling in the digital world. For adults, aspects of digital life seem to be re-creating, in perpetuity, some of the worst dynamics of middle school and high school.45 This pressure is likely influencing how we talk about our kids. Do we feel more compelled to share information about our kids and families because the digital world makes us feel as though we’re at our lockers again, waiting to see who’s cool and who’s not? (Do schools even have lockers anymore?)46
As adults, we’re asked whether we like a post or not. If we like it, we can like it, share it, or retweet it. For nuance, we also can laugh at it or get angry about it. This is taking the ethos, the mentality, the pecking order of adolescence at its worst and putting it in our pockets, homes, cars, and more. Fears that the popular girls are whispering about us in the locker room or the popular boys will lock us in lockers are enacted time and again with each swipe and status update.
Status update. Think about that for a minute. It’s not an “activity” update. It’s not a “thought or feeling” update. It’s a status update. In the offline world, we don’t tend to ask people “What’s your status?” in everyday conversation. We use it technically (as in marital status, occupational status, tax-filing status, delayed flight status) and colloquially (“Hey hon, what’s your status? Working late? Or can you pick up kitty litter because the Walmart digital delivery human duo got locked out?”).
But even in these situations, it tends to be precise, focused. It’s not a general “How’s it going?” And it’s certainly not a thoughtful “How are you feeling about the inevitable stressors and ephemeral pleasures of your day? What have you learned about yourself and those close to you? What meaning are you making out of your failures and frustrations?”
Status. We go here or we go there. We feel good or we feel bad. We get dopamine hits or we get dissed. We are sixteen again, secretly afraid that everyone will forget our birthday even though we know social media will remind them.
In some ways, our daily predicament is worse than Molly Ringwald’s in the 1984 homage to adolescent oblivion and redemption, Sixteen Candles. Everyone she knew forgot her sixteenth birthday! But it’s not like they had digital reminders everywhere, so she could forgive her nearest and dearest for their memory lapses. Plus, the hot guy with a heart of gold finally found her! Screw everyone else. She was seen by the one person who mattered, at least for that moment.
We are all Molly Ringwalds now. We “dwell in [the] Possibility” of rejection and recognition.47 Such is our status pretty much all the time, although we try to maintain perspective. We enjoy those moments when Throwback Thursday gives us just the right dash of nostalgia, offering the perfect swirl of milk into the cup of tea of daily life. We can ignore those moments when no one likes the pic of our cat curling up next to the teapot. Forget tea cozies. This cat is more like a tea kitty, am I right?! LOL?! Anyone? Anyone? Bueller?
The next time, we will post a picture of the baby next to the cat next to the teapot because everyone loves cat pictures.48 It will be like a twenty-first-century Mary Cassatt! Or a real-life illustration of “The Farmer in the Dell.” The farmer takes a wife, the wife takes a child, and they both take pictures of the child. They share those pictures and much else about the child, in part because it’s a form of social status currency. To the extent that we are using our kids like those stone-washed jeans we just had to have circa 1985, we should knock it off. Or should we?
That’s a hard question for each of us to answer for ourselves. We may try to answer it from the mature part of our selves, but we’re pulled in many directions. The simultaneous existence of multiple stages of self that digital life demands, this “never forgetting” functionality of the internet, is impacting adults too. Although our past selves weren’t born and raised online, they are being resurrected there. We’re lucky, compared to our children. Presumably, we are more established than they are. We’ve earned degrees, gotten jobs, been married, had children, and generally gotten along in our lives. We’ve also dropped out of school, been downsized, been divorced, had miscarriages. Status update: we’ve been there and back again. “Roads go ever ever on.”49
When we’re sharing our own adventures—whether it’s the tea we brewed for elevenses/second breakfasts or the dragon we fought that afternoon—we’re the ones hitting Post. When we’re tracking how many steps it takes to outrun that dragon or jumping into our driverless car to speed up our escape, we’re the ones buckling on the watchband or seatbelt. We are consenting adults, hooking up our devices and putting out our own information. But we don’t understand exactly what data we’re sharing, with whom, why, and what they will do with it. The same general confusion over privacy policies, terms of service, and other parameters of our digital tech engagement arises when we’re transmitting our own data instead of our children’s.
The two harms, though, are distinct in origin, scope, and impact. Even though there are limitations to informed consent as an effective framework for personal data sharing in the digital world, adults should be allowed to choose what they do or don’t do with their information. Otherwise, the autonomy and agency principles that structure our legal system start to wobble. Have we identified the “X marks the spot” perfect point on the treasure map? Certainly not. There are many compelling reasons to think about digital privacy and related reforms here too.
But the reasons for thinking about reforms related to adult usage are less compelling than those that prompt us to look at how we share information about our children. The law allows adults to smoke, drink, gamble, sleep around, and engage in all manner of other behaviors that are of questionable benefit to personal and public health. The law does not permit our children to do these things, and it makes adults the chief guardians of their well-being. It’s not a perfect system, but as discussed below, there is a core of ethical, emotional, and pragmatic value in this framework.
We adults have a heightened legal and ethical responsibility not to @#$@ around with our children’s lives. We have greater latitude when it comes to @#$@-ing up our own. We also have had much more time on this planet than they have. We’ve had time to figure out ourselves, more or less, and time to build credentials and connections that will help us get to where we want to go regardless of what our digital dossier says about us. There’s less we can @#$@ up for ourselves than we can for our kids and teens. They have so much more road ahead.
We certainly can and sometimes do hit road blocks in our own digital lives that affect our everyday realities. A common way this happens is when we exercise our own misguided or poor digital judgment. We hit bumpy spots in our marriage when we respond flirtatiously to a high school girlfriend or boyfriend who messages us out of nowhere. These and other digital dalliances with people who “knew us when” may be more about wanting to find our past selves and view alternate visions of our lives than about being attracted to the old flames themselves.50
But the potential for these detours from our domestic routines to bring us into dangerously uncharted territory seems much greater now that we can transcend time and space with clicks rather than the cloak and dagger machinations of yesteryear. Whitman says we contain multitudes.51 Maybe the digital world is simply delivering on nineteenth-century transcendental promises of the infinite self.
Our digital lives also can hit the skids and smash through the guardrails into our “real” lives through other people’s digital choices rather than our own. We can fall victim to criminal, illegal, or unethical decisions by others, much as our children can. Some of those actions are identical, or nearly so, to those that befall kids and teens. We can have our identities stolen. We can be doxxed or trolled. We can be turned into pornography through the distribution of real intimate images or the creation of photoshopped ones.
When a digital transgression happens to us, the law regards it as a less serious offense than it does when it happens to children. The US Supreme Court has ruled that it is constitutional to criminalize the possession of child pornography, even if the person in possession did not create the images.52 These images are themselves a crime, the Court explained, rather than mere proof of a prior crime. It violates a minor’s bodily integrity for such an image to exist and be looked at. It is legally impossible for a minor to give consent to the creation or distribution of such images. In contrast, it is perfectly legal to take or share an intimate image of another adult with that adult’s consent. The crime arises when there is no consent.53
Unlike our children, we adults can legally give or withhold consent to a wide range of digital activities. Like our children, however, we often have limited control over mean-spirited or thoughtless choices made by others. An angry coworker rants about us on Facebook, taking public a private misunderstanding. A flighty friend Instagrams a drunken bikini pool party pic of us, circulating a moment that was supposed to stay in Vegas. A well-meaning doctor prescribes a “smart” pill, giving us a spoonful of surveillance to help the medicine go down.54
These and similar choices threaten our privacy. They also affect our existing and potential future opportunities. Does our boss believe our colleague and pass us over for a promotion? Does the uptight chair of the charity anniversary gala planning committee eject us for unbecoming conduct? That one might go in the “Who cares?” category, but others will not. Will our health insurance premiums rise if we find the new “smart pills” our doctor prescribed too bitter to swallow and stay away? What stories are other people and institutions telling about us based on what we do or don’t do in our digital lives?
Shakespeare admonishes us to be true to ourselves.55 Fair enough, but this is easier said than done with the parental creation of a childhood digital data trail that may affect the child’s developing sense of self. A four-year-old is unlikely to Google herself or even to look at her parents’ Facebook feed for her own image. However, even young children are growing accustomed to having their pictures taken, and requests to post or not post information may come at very young ages.
Tech vendors are also aggressively marketing to all youth, with “infantainment,” smart devices, and other types of tech and content.56 Many of these programs or devices collect data about kids from the kids themselves.57 Toddlers don’t yet have credit cards, though, so it’s up to parents to set the terms of young children’s digital access.
Parents thus shape kids’ digital sense of self, which in turn shapes kids’ overall sense of self, from a very young age. Some parents take this practice to a whole new level by monetizing kids’ stories in the commercial sphere, which is the topic of our next chapter.
Adults’ choices about kids’ digital data can intrude into the space of childhood and adolescence, shape those spaces, capture data from those spaces, and transform the data in terms of audience, purpose, and longevity.58 Childhood moves from being a protected time for play and exploration into a phase that is surveilled, tracked, and analyzed by countless third parties. Adolescence, already a period of tentative and turbulent transition toward autonomy and more personal responsibility, finds new limits on a phase meant for making new choices and making mistakes. Without having developmentally appropriate opportunities to try activities and take on responsibilities, youth find it difficult to develop an authentic sense of self. Without having opportunities to make and then learn from mistakes, youth find it difficult to become open-minded, flexible, and resilient.59 Knowing that they might find it difficult to move past their inevitable mistakes or foolish choices, youth may become overly rigid. They also may become more reckless than they would be otherwise. Either way, Neverland, interrupted, poses an existential threat to youths’ future selves.60
This is when a hand is raised and a voice from the back of the classroom asks, “So are you saying that you want the Lost Boys to stay lost the rest of their lives? Wouldn’t they be kind of pains in the ass?”
No and yes.
We do want Peter to be able to leave Neverland. But we want him to do so when the time is right and when he has the right reasons. We don’t want the pirates to push him out, walking-the-plank style. We don’t want the Lost Boys to mutiny or to follow Wendy blindly back to civilization. We do think they could be more helpful around the Wendy house: do they not notice that their dirty dishes are everywhere?
The transition from youth to maturity inevitably involves some loss. But all is not lost. Joni Mitchell hits the mark when she sings, “Something’s lost, but something’s gained, in living every day.”61 Naiveté is replaced by knowledge. Being rocked to sleep gives way to rolling on toward your dreams. There is no set map for when, why, and how to leave Neverland. Ideally, both the destination and the journey bring meaning and promote mastery. Hopefully, an innate sense of self is nurtured in childhood and is part of the gravitational pull that leads us on. Unfortunately, there are many reasons to think that the “tectonic shifts”62 of the digital world are shocking if not razing this identity core.
Is this the wrong warning to issue? Is it possible that the 7.0 earthquake we’re experiencing is less likely to blow open childhood and adolescence than trap people inside of it? Is technology infantilizing us to the point that we’ll heed Peter’s call and “never grow up, never grow up”? Some argue that tech companies are increasingly giving us robo-parents—digital products and services that “fall into the category of ‘things that I, a 25-year-old-man, wish that I could still get my mother to do for me.’”63
Building on this view, the argument becomes this: the digital world isn’t destroying childhood and adolescence but is letting childhood and adolescence cross traditional boundaries and reshape the landscape of adulthood. The real threat around youth today and for the foreseeable future isn’t that grown-ups will demolish what it means to be a kid. It’s that a certain childlike mentality will erode what it means to be an adult. We’ll abdicate personal responsibilities and let the robots and other digital tech figure out the hard questions and the easy questions alike for us. Then the only real grown-ups in the room will have artificial rather than actual intelligence, and we’ll conclude that we “really don’t know life at all.”64
This argument has some solid ground beneath it. One need look no further than the “bro” mentality of much of the tech ecosystem to see a valorization of some of the less praiseworthy and principled aspects of juvenile existence.65 “Move fast and break things” sets a certain tone and arguably shapes the products and tech that result, although this entrepreneurs’ culture is the province of only a fairly small number of people. We can let the hoodies and suits duke it out, Sharks and Jets style, on a street corner far away from us. Let’s look at the rest of the map, where most of us live.
Increasingly, all of us grown-ups are living in a world of enchanted objects, adorable monsters that fly down the street,66 and cars that drive themselves. Have we crossed the great divide into adulthood or Willy Wonka’s factory? If we can’t quite locate ourselves on the frontier between maturity and make-believe, won’t it be that much harder for our children when they come of age?67 How are we going to teach them to take out the trash when there is an Oompa Loompa robot to do it for them? Already humanoids are programmed to respond to digital directions to bring the goods we order into our homes and put the goods where they belong. We don’t give them an allowance. Instead, they allow us to become ever more removed from the logistics of daily life.
The digital service industry and its human accomplices are more about execution than design, however. We still need to place an order, either manually or on an automated basis, to receive the goods in question. We still need to think about what we need and when we need it and share that specific set of data points with the appropriate digital service provider. Perhaps the rise of AI will allow us to teach a robot to be mission control for a household birthday party for Huck next week (take one mom brain and combine all the data from a calendar)—with the next generation of Harry Potter–style “house elves”68 seeing that the perfect birthday gift is selected, wrapped, and delivered automatically.
Some retailers are moving in this direction by combining digital technologies with human capacities to integrate a more convenient and customized shopping experience into your normal routine. These hybrid services are coming both from traditional brick-and-mortar stores and from digitally based companies. They are offering and developing various combinations of digital/human mammal interaction.
For example, Walmart is already “testing new delivery ideas . . . like delivering packages inside customers’ homes and putting groceries away in their refrigerators” through a partnership with a “smart home” service provider.69 In this model, digital shopping will be combined with a human delivery person who gains access to your home through digital means: you will control and be able to monitor the delivery person’s home access through an app on your phone.70
Other companies are focusing their digital technology use on getting you the products you want rather than getting them inside your home. Popular clothing delivery services “curate” a shipment of shoes, clothes, and accessories based on your stated preferences, thereby taking some of the decision-making burden off you.71
But this is only a small step, albeit one taken in perfect alligator-print stilettos, toward an AI that can do executive functioning rather than function as an executive assistant. You still need to do a lot of personal data input. There is still some human involvement in making style selections. Indeed, that sense of getting a bargain-basement deal on an elite service is part of the allure. When there is AI that can organize our schedules—note that we have an upcoming presentation at work, figure out what outfit we need and when we need it, order the outfit, and have it altered and hung in our closet in time—then we really have stitched together a fix for that category of household administration that moms often supervise. And we’ll have put the magic wardrobe out of business before it ever began. Sorry, Aslan. Nothing personal. See you at the next VC pitch-fest.
When we have AI that approximates a mom brain or household mission control, will it render the human denizens of the house less mature? What will executive functioning AI or other emerging, ever-smarter digital technologies do to adulthood? AI undoubtedly will remove certain tasks and decisions from our regular repertoire of things that grown-ups do.
For instance, presumably you can’t have your robot give your consent to a marriage proposal that came out of a Tinder hookup, even if your robot has all the relevant data. But the impact of that change on some overall maturity quotient for the eighteen-and-over crew is a far more complicated equation. Just because some tasks and decisions have been accepted as part of what grown-ups do does not mean that these are the only actions and responsibilities that connote adulthood. We used to expect the lady of the house to churn butter or embroider. Are today’s women any less adult because they don’t do their own dairy or doilies? AI changes the equation somewhat away from a linear “We outsource X set of life tasks to Y” to “We outsource X set of life tasks and life decisions to Y.” But as long as we still have some degree of agency and oversight in choosing and teaching our robot staff, it seems unlikely that we will suffer from perpetual Panism to the point where we are no longer “adulting.”
Another key variable in this calculus of whether digital technologies might lead us all into a throwback further than Thursday is what the rising generation of adults will choose to do with the additional time that is likely waiting for them before too many more “revolving years” are through.72 If the answer is “play more videogames while my robot does the laundry, pays my taxes, and puts the kids in the driverless car that takes them to school,” then it’s difficult to see digital innovation as maturity-enhancing. If the answer is “get in my thirty minutes of daily cardio, spend more time playing in the dirt with my kids, and volunteer at the local soup kitchen” while the robot cleans up, then it’s difficult to see digital innovation as infantilizing. To be mature adults, do we need to do our own dishes by hand? Could we instead simply take responsibility for ensuring that our dishes are done in an efficient, equitable, nonexploitative manner? No, Peter, leaving them out for Wendy to do doesn’t count.
Whether our kids grow up to do their dishes by hand or by robot, they will need a robust sense of self to engage these and other questions about what it means to be an authentic adult. And the questions won’t keep for eighteen or twenty-one years. At every step along the way, some iteration of “What does it mean to exist meaningfully at this point on the journey?” presents itself. What does it mean to be a capable eight-year-old? Do you need to know how to ride your bike? Is cursive optional or optimal? What about a truly sweet sixteen? Car wheels have replaced cartwheels and also bike wheels. Should the guy with the coolest car be your new best friend, or should you still be riding shotgun in Archie’s old jalopy, sticking with the friendship that is as reliable and comfortable as the car itself?
Without space in childhood and adolescence for unencumbered experiences, it is difficult to lay the foundation for an authentic self. Without a sense of self at your core, it is difficult even to begin to answer all the questions, never mind reaching any conclusions.
The digital tech disruption of youth is causing a seismic disturbance at the center of our beings. And yes, there are also fissures arising in our grown-up landscape from the digital tech infantilization of adulthood. But the damage there is surface level, not structural.
As between the dueling threat models of adults to childhood and aspects of childhood to adulthood, the first is the bigger threat. Adults’ decisions about children’s digital lives are disrupting childhood such that individual youth and the life stage of childhood itself might not recover. Digital tech does enable some encroachment of childhood’s “Let mom take care of it” attitude into adulthood. However, adults who have been able to cultivate a sense of self-understanding, self-efficacy, and internal accountability will know how to seize the constructive opportunities digital tech affords and decline the destructive ones.
Are all children given an equitable opportunity for such personal development? Our country has heartbreaking inequalities in childhood opportunity73—poverty, abuse and neglect, sexual assault, discrimination based on identity including race and sexual orientation, discrimination based on immigration status, lack of access to health and mental health care, lack of access to dental care, weak education systems, parents and caregivers caught in opioid and other addictions, gun violence, bullying, environmental toxins.
These and many other deep structural problems deprive too many children of a secure foundation from which to begin their lives. The deprivations arise in countless ways. Notably, the risks from engaging in activities that could run you afoul of law enforcement are heightened for kids and teens of color. Most pressing: the consequences are lethal. They are also developmental. Kids and teens need to be able to explore a range of choices and even make mistakes in order to learn and grow.
Is it a fake gun or a real gun? Twelve-year-old Tamir Rice never had a chance to answer this question for himself. Tamir was playing in a city park when he was killed by a Cleveland police officer. When his fourteen-year-old sister ran over to him after the shooting, police “tackled her to the ground and put her in handcuffs.”74 The gun was fake.
Tamir’s death is one of many. Too many kids and teens of color, especially African American youth, die when their ordinary behavior is met with deadly force from law enforcement or private individuals. Even going to school is proving unsafe for children in families with mixed immigration status. Federal immigration officials are detaining immigrant parents when they take their children to school.75
For these kids and teens and their parents, it doesn’t matter that they are US citizens. It doesn’t matter that the US Supreme Court has made clear that public schools must educate all youth within their jurisdiction, even those who lack legal immigration status.76 Facing a choice between keeping their families together and keeping their education on track, many families with mixed immigration status are keeping their kids at home. Schools are moving from the heart of our democracy to a potential hell, through no fault of the schools themselves.
Sources of fear and loss in our public schools go beyond the current immigration policies of the federal executive branch. Significantly, gun violence, by both youth and adults, is killing both youth and adults. It’s also killing our collective faith in schools as a protected space. Firearm drills are the new fire drills. Sheltering in place, under your desk, is likely to be about as effective against this threat as it would have been against the nuclear bombs of the Cold War era. At least those nightmares never became reality.
So why should we advocate for an ideal of childhood and adolescence as the foundation for an authentic self if that prospect is out of reach for many? Doesn’t that minimize the obstacles and marginalize the already vulnerable among us?
The goal is to deepen and broaden our collective commitment to respecting and protecting childhood for all kids and teens. In its current form, that commitment is deeply broken. Rethinking our relationship to digital tech will not fix the fault lines that leave some kids with security and others with instability or trauma. Our kids won’t have cleaner water if we post less on Facebook. In fact, we might move less quickly toward collective goals if we remove social media or other digital tools from our advocacy toolkit entirely.77 But if we engage in rethinking, we might move more quickly toward such goals in a more privacy-protecting way that is healthier for our kids.
All such rethinking needs to take place with an awareness of the structural problems that plague us. Saying that adult use of digital tech needs to be reexamined doesn’t mean that adult choices in other areas don’t also need to be put under the microscope. In fact, a process of adult self-reflection and transformation toward greater protection of and respect for childhood in one sphere (digital tech use in daily life) will hopefully strengthen rather than weaken the odds of that process happening in other spheres as well.
Let’s turn now toward a sphere where parents are focused on the positive value of youth experience and are measuring that value in terms of money—“commercial sharenting.” At first glance, this commercial sharenting sector might seem to be galaxies away from the disheartening set of societal failures that are leaving children hungry, abandoned, and dead. Monetizing children’s and families’ experiences turns on making those experiences so appealing to viewers that marketing, sponsorship, and other dollars follow.
This process results in a lot of fairy dust and sparkles, literally and metaphorically, being put on display. But it also results in a surprising and, at times, disturbing display of the pirate side of Neverland. Commercial sharenting content can get dark. It can get predatory. People are watching. Let’s take a look at what they’re seeing. And let’s take a look at how the commercial sharenting sector reflects truths and trends about adults’ digital use of children’s private lives even when no money explicitly changes hands.