Tom Sawyer was a creation of small-town frontier life. Our Tommy S. shares the spirit of the original—a scamp with a heart of gold. But that’s where the similarities end. Tommy S. exists on the frontiers of cyberspace. He has digital data in his DNA.
This chapter tells Tommy’s story, a fictional yet true-to-life portrait of childhood and adolescence today. This case study is designed to be representative, not universal. It aims to surface key modes of sharenting, some of which you may see in your own lives, some of which you may not. You may also see sharenting in your lives that you don’t see mentioned in these pages.
As you read, think about what seems familiar and what seems foreign, what seems straightforward and what seems surprising. Do some of the sharenting activities seem inevitable, inherent in life today, while others come across as more discretionary? Where on the “Three Bears” scale would you rank the sharenting in Tommy’s life: too much, too little, or just right? What are the reasons for this ranking? Might you be inclined to assign different rankings at different times of Tommy’s life?
The stages of Tommy’s young life have a parallel digital sequence that tracks his development. The same is true for the children in our own lives. During each of these stages, we parents, teachers, and other adults are likely to share digital data about kids and teens that creates a “digital dossier.”1 These three stages are creation (which includes conception, gestation, and infancy), education, and maturation.
This chapter treats each stage in sequence. This mapping is a rough guide to the landscape of adult decisions about kids’ digital data. There are two important caveats: (1) no map, including this one, could capture all conceivable actions that adults as a cohort could take, and (2) the actions taken by individual adults will vary considerably.
Why the caveats? Tech providers and tech users innovate at lightning speed. Individual users have their own preferences and patterns. Any list that attempts to capture all the instances in which all types of children’s data are shared by all adults through all available and emerging digital technologies would be obsolete in a nanosecond. Adults’ practices around the digital transmission of children’s data involve sharing many types of information through an ever-growing array of digital products and services with a wide range of individuals and institutions for countless reasons.
The types of information include, but aren’t limited to, medical, educational, social, behavioral, and psychological. The types of digital products and services involved include, but aren’t limited to, laptops, smartphones, tablets, social media, text, email, sensors, and “smart” devices. Products also include the doggie drone, which supervises your kid when he walks the dog and scoops up the poop in case your kid forgets. Okay, that canine companion isn’t on the market yet,2 but you can get your child a robotic dinosaur instead of a real dog for company.3 And you can also get a digital dog treat dispenser linked to an app so you can play with your dog while you’re away from home.4
Before Tommy is born, his parents have a real dog, and they want a real child to keep it company. They are older when they start trying, so they have trouble conceiving. Tommy’s mom’s well-meaning obstetrician recommends that she start using a fertility app to track her menstrual cycle and advise her on the best times to have sex. The app also advises Tommy’s dad on where to buy flowers and when to send them.5 When the app doesn’t work fast enough, Tommy’s mom adds a fertility tracking bracelet.6
The result is Tommy.
The app and bracelet predict Tommy’s creation. The United States Supreme Court has told us that it’s not its business to decide when life begins,7 but tech companies are rushing in where the highest court in the land and others fear to tread.8 Although the jury may be out on when life begins, the verdict on digital life is unanimous. Digital life can begin before conception.
When conception does occur, Tommy’s proud parents-to-be announce it on Facebook. They follow up with all the breaking news on gestation. His first ultrasound pic pops up in the newsfeed of tens of thousands of adults around the world. His parents have their privacy settings set to share with friends of anyone tagged in the photo. In addition to tagging themselves, they also tag their parents and siblings: those proud future grandparents, aunts, and uncles.
A friend of Tommy’s aunt is an ob-gyn. She notices what appears to be a minor abnormality on the scan but decides not to write a comment on the post, figuring that it would be inappropriate. She also decides against reaching out to the parents via private Facebook message. After all, the happy couple is trying to share good news, not crowdsource their prenatal care. She’s not their doctor. It’s possible she was reading the ultrasound incorrectly. An iPhone picture of an ultrasound printout posted on Facebook and viewed on another iPhone screen while she’s getting her steps for the day on a treadmill isn’t a hospital-grade presentation. She has hundreds of her own patients to worry about, and she’s on call tonight. It’s best to get back to her workout before the next patient goes into labor.
Even though this Facebook viewer has kept her distance from the ultrasound data, Facebook hasn’t. Facebook is all up in that womb. Under its privacy and related policies, Facebook can use the picture’s data in a virtually unrestricted fashion.9 Tommy’s name isn’t on it when it’s first posted. But as soon as he’s born, his parents post his newborn mug shot, complete with his full name, date of birth, length, and weight. A handful of rogue hospital employees do the same, tagging Tommy as a “mini-Satan,” and causing a @#$#-storm before Tommy can soil his first diaper.10
Facebook and its associated service providers can likely connect the dots from the ultrasound pic to the newborn pics and start aggregating data about Tommy, including in its facial recognition database.11 Attempts to deidentify the data prior to aggregation should be viewed with some suspicion. Experts have found that deidentification is not always an effective means of protecting privacy from tech vendors, data miners, insurers, and other third parties.12
To be fair to Facebook, there are some pictures of kids and teens that it does not want its users to see. According to internal company guidelines on content moderation obtained by the media, Facebook will remove “imagery of child abuse if shared with sadism and celebration.”13
Outside of those narrow categories where Facebook or other social media companies consider content offensive and subject to removal or other action (whether based on their own decision making or legal requirements), parents are left to decide for themselves whether to share a milestone with their digital social circles. There is no Parental Social Media Association of America to give binding guidance to parents.14
Tommy’s parents record his first bath. They post the pictures and get a ton of likes, which encourages them to take and share more pictures. They back up their photo library to storage space on a cloud-based server, which is helpful when they drop their phones into the tub while bathing Tommy. It’s less helpful when their storage space is hacked and tubby-time pics float into unknown waters. They figure that the hackers are more interested in naked celebrities than bubble-covered, half-naked newborns,15 so they let the missing pics be water under the bridge.
Rub-a-dub-dub, in and out of the tub, Tommy’s parents are members of the digital monitoring club. They watch his every move on Nest Cam.16 They have a scare in the middle of one night when they think a hacker has gotten into that digital baby monitor feed as well.17 It turns out that Tommy was just snarfy. They track his sleep patterns with the Owlet bootie, a sensor-enabled sock that monitors infant breathing, sleeping, and other physical patterns.18 They use a nanny product with artificial intelligence (AI) to help respond to and soothe him when they are unavailable.19
They also continue to share news of his activities on social media with exposure to thousands of their nearest and dearest. They use a free digital service to make baby books.20 These collections are only for the viewing pleasure of Tommy’s grandparents, aunts, and uncles—and whichever human or machine eyes use the images for whatever purposes now or in the future. Well before Tommy takes a single step, his digital data travels to thousands, likely tens of thousands, of human and machine users.
Two-year-old Tommy is obsessed with Sesame Street, so smart Elf on the Shelf flies to the North Pole and grabs a smart Sesame Street denizen for Christmas. Okay, smart Elf on the Shelf can’t really source directly from Santa’s workshop. Tommy’s parents didn’t put in its batteries correctly. Also, this particular product doesn’t exist yet. Neither does smart Elmo. But he isn’t a Christmas myth. There could soon be an Elmo available that says more than “Tickle me.”21
Tickle Me Elmo is so twentieth century. For the twenty-first century, we need smart Elmo. Smart Elmo says, “No tickle.” Smart Elmo want to read. Smart Elmo want to do math. Smart Elmo want to be Tommy’s friend and give fuzzy Elmo snuggles. Drawing on the cognitive computing technology that drives the IBM Watson machine, smart Elmo would take the Sesame Street lessons that have provided an early education foundation for generations of children and bring them to life.22
So instead of learning his ABCs from Elmo’s interactions with the Sesame Street gang, Tommy would practice his letters under Elmo’s fuzzy tutelage. Elmo would show Tommy how to get to Sesame Street without leaving his parents’ home. As the AI technology underlying smart Elmo grows more sophisticated, instead of watching Elmo play with other kids, Tommy would play with Elmo. But what might Elmo be learning and sharing about Tommy?
Strong, easy-to-read, and fair privacy policies need to be in place for any smart toys or smart teachers; otherwise, children’s data can be used for an unspecified set of purposes over an indeterminate length of time. When Tommy gets sick and throws out Elmo until the fairy rescues the stuffed toy, Velveteen Rabbit–style, Tommy can also play with smart Barbie, a tracking teddy bear, and more.23 He can go old school and play with an app: “apps meant to appeal to toddlers and preschoolers are both the most popular [type of app] and the category that has experienced the fastest growth, according to a 2012 study.”24 Tommy can also chill with Siri, Alexa, and the other home assistants that finally learn to decipher toddler speak and play Sesame Street on command.25
When Tommy does leave the house to go to daycare, his parents get real-time updates from the daycare provider on a childcare app.26 The pictures from the provider include Tommy playing with the other kids. His favorite seems to be a little guy named Huck.
You may be wondering: are all of these digital data choices made by Tommy’s parents, or are some of them Tommy’s? Tommy doesn’t choose whether or where he goes to daycare. Tommy does decide whether to play with smart Elmo or the tracking teddy bear. Tommy decides whether to smooch Elmo or hit him in his red furry face.
This can be a tricky line to draw. Sometimes, adults’ choices around digital devices and services are about their own actions—whether to share a pic on their own social media page. Other times, adult choices directly or indirectly facilitate the subsequent choices that children and adolescents make, especially for young kids who can’t express a real preference about tech decisions. The toddler can’t go out and buy a smart Elmo on his own.
Other instances of adult facilitation discussed below highlight the role that parents and other decision makers play in allowing schools, camps, and other youth-serving settings to store, share, analyze, and otherwise use kids’ digital data. These institutions will receive some of the data from kids’ own interactions with digital devices and services. But kids are in the position to share this data because of tech choices that adults made first.
Tommy has his parents to thank for his tracking teddy bear. He’s also got them to thank for starting to build his education record while he’s still in diapers. In one sense, as Oscar the Grouch might point out, that proposition falls into the “There’s nothing new under the sun” category. The early childhood years have long been understood as foundational for educational development. The new revelation is that these early experiences in the home that previously would have been recorded in parents’ memories and scrapbooks are now digitally preserved and used by one or more tech providers, their affiliates, or other third parties outside the home—often without parents’ full knowledge.
When Tommy travels from his own personal Sesame Street to the public elementary, middle, and high schools down the street, he continues to be immersed in a connected digital world. Some of these digital “educational technologies” or “ed tech” are used by Tommy himself in the classroom or other school spaces.27 This decade is saying “Open sesame” to the floodgates of ed tech. The volume, types, and purposes of available digital products for student use in school are staggering. In many school systems, ed tech adoption has been rapid and widespread. Often, it is happening in an iterative, bottom-up way that brings a touch of that Silicon Valley “move fast” spirit into the more slow-moving world of public primary and secondary education.28
Many ed tech entrepreneurs try to introduce their services directly to classroom teachers or the other front-line decision makers in a given educational sphere. This is more of a side-door or back-door approach than front-door outreach to a schoolwide or districtwide decision maker, like the principal’s or superintendent’s office. Going through the side or back door may facilitate ed tech use as teachers and other staff quickly roll out the welcome mat for offerings they deem valuable to their students. Sometimes teachers become ed tech entrepreneurs themselves, which also can provide a window of easy access into classrooms.29
The protocols and formalities of the front door tend to slow down this decision-making process. But this delay can have a protective function, not just a pain in the ass one. If the front door is guarded by decision makers with a combination of technical, legal, and pedagogical know-how, a little delay may go a long way to creating meaningful educational experiences while protecting privacy.30
Even though Tommy is engaging directly with educational technologies, many of the devices and services that he uses are chosen for him or assigned to him by teachers or administrators, especially when he is younger.31 Some of them his parents know about ahead of time because the school sends home a release asking for permission to transmit Tommy’s personally identifiable information (PII) to the digital vendor that provides the service.32 But many others his parents will not know about, unless Tommy happens to mention how cool it is that a classmate liked his progress in an app the whole class is using.
He learns to read using a personalized learning platform on the iPad that is assigned to him through a one-to-one device program.33 The program allows him to progress at his own speed rather than be stuck on a one-size-fits-all plan for the whole class. Tommy likes the iPad. He is already very familiar with the device because his parents toilet-trained him using the iPotty, which allows him to play while he poops.34 He pays for school lunch using a swipe card linked to a web-based portal through which his parents can put funds on the card.35 Tommy’s parents sometimes get confused because the website they need to visit to reload his cafeteria card is different from the one they need to visit to see his grades.36
Tommy gets confused too. In addition to his cafeteria card, he has another card that uses a sensor to track when he gets on and off the school bus. He participates in a physical education program that uses a smartwatch to track his fitness metrics.37 When he starts high school, he is given a school-issued laptop that he is required to use at school and encouraged to take home for his work there.38 He goes online through his laptop to learn the history of the American frontier through a massive open online course (MOOC). And when he gets sent to the principal’s office for trying to fake his own death so he could attend his own funeral, he is assigned social-emotional education modules from a software program as a behavioral intervention.39
There is a seemingly never-ending stream of digital resources to create learning experiences for Tommy. In many ways, he is lucky: his school has the financial and other resources to bring digital technologies to classrooms and other school spaces. Although most school districts in the country appear to be using one or more types of digital ed tech,40 a “digital divide” remains in terms of access to and integration of these resources. Even within a district, the types of digital devices and services can vary considerably.
Some of them are old-school, like websites that Tommy visits from a computer that lives on a desk in his classroom. Mavis Beacon no longer teaches typing, but she’s got many disciples.41 A growing number of digital educational experiences are a post-1999 party. They rely on the cloud,42 sensors, the Internet of Things, artificial intelligence (AI), and other emerging technologies. Some of them are designed specifically for schools, like a sensor-enabled card for attendance. Others are designed for a general audience, like a MOOC, and then integrated by teachers or other staff into the brick-and-mortar school setting.43
Within a school system, different ed tech types will be used by people in different roles for a range of goals. In addition to the ed tech that Tommy interacts with, teachers and administrators who work with Tommy during his primary and secondary school years also use ed tech to support their professional responsibilities. Tommy has little to no awareness of this ed tech sphere, and even his parents may not have any understanding of this space, despite the notifications about some of these ed tech choices that they receive from the school. The principal’s office tracks attendance with a software program. This program sends alerts to Tommy’s parents if he comes late to school or doesn’t come at all.44 The nurse’s office keeps electronic health records.45 The teachers use an online grade book to track assignments.46 The art teacher creates a public-facing Facebook page with pictures of students’ art.47 The guidance counselor uses a predictive analytics program to assess Tommy’s educational and career trajectory.48 Do his talents lie in whitewashing fences or whitewashing the truth?
Preteen Tommy starts to test the limits around him, through fibs and more. He starts getting into trouble at school. The school’s digital surveillance system sees him cutting class and smoking in the woods.49 He gets referred to the juvenile justice system and gets a digital rap sheet that way.50 Some of the trouble is a result of his own digital choices. In one of those adult logic moments that exasperates teenagers, schools sometimes promote science, technology, engineering, and mathematics (STEM) in one breath and, in the next, breathe fire over minor digital tech infractions. Notably, New York has wrestled over how deeply the fruits of the tech tree should take root in schools. The New York City school system has been engaged in a protracted give and take over how much latitude students should have to use their personal devices in schools. A cellphone ban spawned new brick-and-mortar business opportunities for some corner stores, which started offering phone storage to kids for a fee.51
City mice aren’t the only ones trying to evade the trap of trouble over their personal tech. Their country cousins may also face a maze of contradictory tech signals at school. For example, in Manchester, New Hampshire, many of the old mill buildings that drove the late nineteenth-century economy have been rescued from their twentieth-century decay and are now home to a mini-Silicon Valley along the Merrimack River. STEM has some serious steam behind it. The high school is encouraging its students to jump on this tech train while also being vigilant that students’ personal tech adoption doesn’t go off the rails. According to a recent edition of the district’s student handbook, using a personal tech device in the high school is a level 1 offense that can carry serious sanctions. Three level 1 offenses add up to a level 3 offense. Other level 3 offenses include bringing a weapon to school. Level 3 offenses lead to out-of-school suspension.52 Out-of-school suspension separates teenagers from their phones and forces them to dial into studying. Problem solved.
Out-of-school suspension increases students’ chances of dropping out. It also increases their chances of being involved in the juvenile justice system.53 Even if the underlying offense of repeated personal tech use in school doesn’t violate criminal law, being out of school and likely unsupervised increases the risk that kids and teens will commit offenses that do. Also, in today’s “zero tolerance” school culture, it’s plausible that a student’s refusal to comply with any school policy could lead to an arrest based on a disorderly conduct or similar charge.
Let’s say that teenager Tommy won’t stop texting his bro, Huck, about Becky Fletcher, the hottest girl in school. The teacher looms over him and says that she really means it this time: “Put that phone away this minute.” Tommy is embarrassed. What if the teacher saw what he wrote? Is there any chance she’d believe that “I heart those titties” actually means “I chastely desire Becky for her superior intellect”?
He shoves the phone inside his desk so hard that his school-issued iPad falls out and hits the ground. Is it broken? Tommy is nervous. What if he broke the iPad? That @*#( is expensive. If his parents get a bill for it, they’re not going to be able to pay it. He doubles down on his conviction that this mess is all his @#*($@#* teacher’s fault. His teacher feels the same way Tommy does. The principal is going to run her out of town on a rail if a broken iPad turns up in her room. That @*(#* is expensive, and the big grant that the school received to buy the devices isn’t going to be renewed. This is all Tommy’s fault.
The following exchange ensues:
Teacher: Go to the principal’s office.
Teacher: I’m calling the school resource officer unless you go right this minute ago.
Tommy: Don’t you mean “right this minute”? How could I go anywhere “right this minute ago”? I ain’t magic.
Teacher: “I’m NOT magic.”
Tommy: Yeah. I know. You’re DEFINITELY not.
Teacher: We’re not talking about me. We’re talking about you. You’re NOT magic. NOT. Not “ain’t.”
Tommy: If I ain’t NOT magic, then shit . . . that makes me magic!
Tommy’s principal disagrees. Tommy gets a Saturday in-school suspension. His infractions and accompanying consequences are recorded in the digital database the school has for disciplinary purposes.54 He is also assigned social and emotional learning modules on the importance of following instructions and refraining from profanity. His performance on those modules is recorded by the software program. Performance metrics include his actual answers and the data surrounding his answers, such as how long it took him to come up with each answer. Tommy’s personal answering speed is less time than it takes to whitewash a fence by yourself and more time than it takes a crew of others to do it for you.
Tommy learns lessons about himself and the world outside of school as well. He attends camps, after-school programs, and community sports and activities. These informal learning spaces seem to share with their formal learning space counterpart, schools, a trend toward expansive ed tech use.55 Tommy goes to an engineering camp where he builds a robot out of egg cartons and other recyclables. He learns how to make it move using a sensor kit and a connected app. His counselors post pics on social media. Tommy decides to start his own YouTube channel to showcase his science experiments, with a prank twist. “The Adventures of Tommy S.” becomes an instant classic—especially the episode on how to teach a robotic dog to ride a raft in a backyard pool.
Everything in the broad and ever-expanding category of ed tech with which Tommy engages is capable of capturing and storing vast amounts of data about him.56 The extent to which each product or service does so will vary, often without Tommy or his parents being aware of the variation. Many of these digital services and devices also can analyze and share this data with third parties.57 Sometimes, this sharing will be obvious, like a public-facing Facebook page. Sometimes, this sharing won’t be evident, even to the users themselves. For example, when Tommy swipes his card to enter and exit the school bus, does he really know who is seeing his travel data and why they care? Do his parents know?
The company that supplies the sensor cards and accompanying software to the school district is likely using third-party providers to supply services such as data storage. If there are weak terms of service or poor or no contractual language in place between the company and the school, the company might also share the data with a data brokerage service or consultant who specializes in advising transportation companies of market opportunities.58 The company might purport to deidentify Tommy’s data, but data deidentification is not always a silver-bullet solution for protecting privacy.59
As we’ll discuss later, there are federal and state laws that limit schools’ authority to share information about their students. Unfortunately, these laws are often unable to establish data-sharing parameters that promote students’ success, encourage educational and technological innovation, and keep privacy pitfalls to a minimum. Tommy may only have to walk down the street to get to school. But his data may travel far and wide, which poses potential problems for him in the present as well as the future. Smart Elmo says, “That doesn’t sound very smart to Elmo!”
That’s not even the whole story, smart Elmo! Schools are a primary arena in which the adults in Tommy’s life are going to make decisions about his digital data, often without realizing what they’re deciding. As Tommy grows up, his parents, teachers, and other caregivers will make digital data choices in many other domains as well. Those other domains may be loosely grouped into four broad categories (a particular digital decision may fall into more than one category):
• Social communications with peers, including social media, contributions to parenting blogs, or the generation of other noncommercial digital content;60
• Familial, educational, or other interactions with the government;61
• Medical and behavioral interventions mediated by digital tech; and62
• Household and general life management through Internet of Things devices or other programs, such as a smart fridge to monitor the grazing habits of voracious teenagers or a smart thermostat to keep them from turning up the heat to sauna levels rather than putting on a @*(#@* sweater.63
Let’s now return to the broken iPad incident that landed Tommy in his own Breakfast Club Saturday morning suspension. Following Tommy’s refusal to obey his teacher right away, the teacher calls the school resource officer (SRO) for help.64 The SRO is a member of the local police force who is stationed in the high school. He arrives in the classroom and arrests Tommy for disorderly conduct and destruction of property. Tommy talked back to the teacher, and he made a scene in class. Also, the iPad won’t turn on when the teacher tests it.
When these juvenile charges are heard by a judge, Tommy’s court-appointed public defender puts the teacher on the witness stand and asks if the iPad was charged when she attempted to start it. The teacher doesn’t recall. No one is able to locate the iPad in question. The local prosecutor drops the destruction of property charge, but the disorderly conduct charge is found to be true. This case disposition is aggregated in a court database, despite the strict rules of confidentiality that are supposed to apply to juvenile cases.65
Quick question from smart Elmo: why is the outcome of Tommy’s case brought to us by the letter T for true rather than the letter G for guilty? According to the ABCs of juvie, when youth commit infractions that would be criminal if done by an adult, they are charged with delinquency rather than criminal offenses. The animating principle for the juvenile justice system is rehabilitation, ostensibly without the same infusion of other objectives of the criminal justice system, such as deterrence and retribution. Because the juvenile judicial process is not supposed to be focused on wrongdoing but on kids’ doing better, the question isn’t guilt or innocence. Instead, we’re looking for “just the facts, ma’am.”
Did Tommy disturb the classroom by repeatedly and loudly refusing to comply with the reasonable requests of a local government official (his teacher)? Yes or no? True or not true? There is a big exception to this “truthiness” approach: when youth commit certain types of violent acts, they may be charged as adults and found guilty with a capital G.
Fortunately, Tommy’s actions don’t land him in the category of teens who can be tried as adults. His parents, however, are dismayed at their son’s conversion from glimmer in their eye to juvenile delinquent and go online looking for advice and support. They post on social media: “Help! How do we reform our bad boy?”66 Tommy’s mom writes a lengthy Facebook post about how difficult it is to parent a teenage boy with a wild streak and hypothesizes that perhaps he has an underlying mental health disorder. Buoyed by comments from her extended network, she sets the post to “public.”67 The post is even more successful than Tommy’s YouTube channel and is viewed by thousands of people across the world. She thinks about starting her own YouTube channel, “Bad Boy Mom,” and getting commercial sponsorships from companies that sell things that moms of bad boys need, like apps to track your teenager’s whereabouts.
Then fate intervenes. Tommy’s mom and dad have been struggling with an addiction to prescription painkillers. For her, it started with a script her doctor gave her after back surgery. Tommy’s dad then decided to help himself. They have managed to keep their use from interfering with their lives until they can’t anymore.
Tommy’s parents pass out in the front seats of their car. Tommy is in the back seat, panicking. The police show up. While one officer administers Narcan to Tommy’s parents, the other officer snaps a picture of the scene, including Tommy’s face in the back seat—his worst nightmare. No wonder he was acting up in school.68 The officer posts the scene on the police department’s Facebook page. Later, when asked why he chose to share this private moment with the world, he cites the need to educate people about the evils of opioid abuse. He says that no one will remember that Tommy was there.
Tommy will remember that he was there because it is the worst moment of his life. Countless other people also will remember because the picture won’t ever go away. Tommy goes to live with Aunt Polly, who will be his temporary guardian while his parents are in court-ordered in-patient rehab. Aunt Polly is concerned that Tommy might continue to get into trouble. She starts researching apps to track his whereabouts and installs a digitally networked security system in her home so she can monitor what Tommy does when she’s at work.69 Unfortunately, it’s not enough to keep her from having more #teentroubles with him. Kids will be kids, regardless of who’s watching.
Tommy will likely never have the opportunity, as did Twain’s Tom Sawyer, to watch his own funeral and find out what his legacy would be. But Tommy and other twenty-first century kids have the opportunity, on entering early adulthood, to go online and see at least some of the legacy of their childhood. But there is a whole digital history of their childhoods that Tommy and his brethren won’t ever be able to see because it’s in the deep internet of data brokers and others.70 There is also a whole range of decisions potentially being made based on or influenced by their visible and invisible digital childhood legacies about which they will have little to no knowledge and over which they will have even less control.
Put yourself in Tommy’s shoes. When he is old enough to see the sharenting that has been and is being done about him, what do you think his reactions will be? Why? Do you think these reactions are justified or juvenile (in the pejorative, not descriptive, sense)?
Now walk that proverbial mile in his parents’, teachers’, or Aunt Polly’s footwear. What do you think the reasons are for their sharenting decisions? Do you see those reasons as being solid, suspect, or somewhere in between? Might you start to realize that you already have a decision-making schema you’re using to approach sharenting, even if you haven’t been using the terms sharenting or decision-making schema?
Your answers to these questions might depend in part on what you understand the short- and long-term risks of sharenting to be. To equip you to explore these and other questions more deeply, the next two chapters explore the main types of risks associated with digital childhood legacies and highlight the potentially positive opportunities that flow from these legacies.