6. Drones and Growns: Navigating the Digital Era

France is training eagles to attack drones.1 The eagles are good at it. In the twenty-first century, chivalry may be dead. But medieval practices are not dead yet. There is a lesson here, but it’s not as straightforward as you might think. At first glance, the lesson might be that older eras can conquer the digital one. Now take a closer look. The sky has plenty of room for both the machines and the birds. The lesson is that the layers remain. Eagles hunt drones now. They still also hunt small mammals. They don’t need an app for that.

All of us parents, teachers, and other adults who care for kids and teens are a lot like those eagles. We inhabit a new world filled with digital creatures. And we have the instincts to do the same fundamental things that our parents, educators, and other caregivers did for us.2 We feed, shelter, nurture, and train our young. We leave our nest to find food for them. We take out a second mortgage to put a new roof on that nest and then a home equity line of credit to send our big chicks to college.

Our challenge today is this: how do we train ourselves to take our parenting, teaching, and other caregiving instincts and adapt them to navigate today’s digital landscape? The lesson is this: we can do much more than we think we can. We can keep our own young out of the way of drone attacks, metaphorical and potentially literal, and set them up to soar.

Where have your instincts been taking you as you parent, teach, or otherwise engage with youth? Have you observed others around you proceeding in similar ways? Do you see key public institutions—such as public schools, legislatures, courts, executive branches, and regulatory agencies—exhibiting positive, negative, or neutral instincts around how adults should treat children’s private digital data? How about key private actors, including private schools, tech companies, and other business interests? Are there changes you would like to see in your personal life, our shared public life, or the private sector to chart a different course for the relationship that parents, teachers, and other trusted adults have to shaping children’s digital lives and the attendant current and future opportunities? Do you understand those changes as animated by big-picture principles, pixelated details, or somewhere in between? Asking these types of questions, regardless of how you answer them, makes you the proverbial early bird when it comes to recognizing and reflecting on sharenting.3

This chapter begins with a thought experiment so we can better explore our instincts and those of the people and institutions around us. Next, this chapter and the following one outline a “thought compass”4 to reorient our approach to our children’s digital lives. Childhood and adolescence should be valued as unique life stages that are anchored in play so that agency and autonomy can be developed through bounded experimentation—making and learning from mistakes.

This compass is not a complete to-do list for personal best practices, legislative or other structural reforms, or any other checklist. Rather, it’s a navigation device grounded in overarching principles.5 These two chapters do offer a few suggestions about potential reforms—some of which are new, some of which may be familiar—to illustrate how some of the compass principles might be implemented, but those are not the main feature. The goal of these chapters is to help us find our own and a collective true north for nurturing a playful, meaningful, and self-affirming coming of age in the digital era.

We have more building blocks at our disposal than the Darling kids did in their nursery. We have legal protections for the home and family. We aspire to rehabilitate rather than punish misbehaving youth. We have some understanding within the legal system of the need for kids and teens to explore and test limits, although not (yet) a robust understanding of a play-based paradigm. And we aim to have meaningful consumer participation in our capitalist markets.

As detailed above, these and related commitments are experiencing complication and erosion, but they are by no means gone from the picture. Digital content is perpetually mashed up. We need to get more comfortable remixing the legal and other principles that shape the institutional and individual decisions around parental and other adult disclosure of children’s data so we can respond effectively to scenarios like the one below.

Thought Experiment: A Near-Future Hypothetical Scenario

In this near-future hypothetical scenario, you’re helping your seventeen-year-old daughter finish her college applications. The applications require her SAT score, SAT 2 scores, AP scores, and Tyke-Bytes “personal capital” scores. What the heck is Tyke-Bytes? Siri tells you that Tyke-Bytes serves as “your child’s passport from her past into her future.” You ask Siri to stop reading the Tyke-Bytes sound bites and do some digging. The response: Tyke-Bytes is a commercial database that serves as a repository of childhood data and a clearinghouse into adulthood. Tyke-Bytes aggregates as much data about each child in the country as possible and then packages the data for purchase by different types of institutions and individuals. The most popular product is a set of scores that rates children’s likelihood of future success in a range of areas, including education, athletics, and employment.

Tyke-Bytes will share these “personal capital” scores with any individual or institution that pays for them, isn’t legally prohibited from having them, and demonstrates what is, in Tyke-Bytes’ opinion, a legitimate need for them. You and your daughter don’t need to do anything to have these scores sent. All colleges that receive applications from her will request and receive these scores from Tyke-Bytes at no cost to individual applicants. Tyke-Bytes does allow parents and youth age eighteen or over to opt out of having Tyke-Bytes collect and share their information. But the Tyke-Bytes website warns you that opting out risks your child’s future. “After all,” the perky chatbot in the “Click here for help” section tells you, “an applicant without Tyke-Bytes scores is like a car without airbags: you could take it for a spin, but why risk it?”

Tyke-Bytes doesn’t exist. Yet. But its potential existence is far from the “Clap now if you believe in fairies” scenario.6 Some services exist already that reduce youth skills in a particular domain to a number, such as the Universal Tennis Rating system.7 And a cross-cutting data-aggregation and analytics service that uses sensitive digital data about kids to generate scores across the board to inform decisions by gatekeepers about kids’ future opportunities is not far from what is happening already.8

For example, a recent study from the Center on Law and Information Policy at Fordham Law School on the student data broker sphere found that higher education institutions use data gathered by commercial brokers for recruiting.9 This data is sold broken down into specific lists or “selects” (“attribute[s] that can be used to filter a subset of a mailing list”), including “ethnicity, religion, economic factors, and even gawkiness.”10 Some currently available lists or selects include: “Home School Oriented Christian Families,” “Jewish Households with Children Nearing High School Graduation,” and “Rich Kids of America.”11 There doesn’t appear to be a score attached by the data broker. Yet.

The hypothetical Tyke-Bytes business plan would suffer from some holes, notably the lack of data in the specific areas that the law protects from disclosure, such as juvenile justice (although there are plenty of holes in those holes through which ostensibly protected data does get out). These holes are unlikely to tank the entire project.

What’s so bad about receiving a set of scores based on your childhood experiences? We have credit scores. Insurance providers assign scores using proprietary formulas to which we have limited or no access. Admissions offices for colleges and universities use models to predict student success and make admissions decisions. Judges and probation officers assign scores to predict future dangerousness and help set bail conditions.12 Painting with a broad brush, the practice of various institutions and individuals using data-driven predictions to inform their actions is well established.

Are we troubled by the prospect that a parent’s blog post about a toddler’s toilet-training fiasco in 2016 might play a role in determining where that now toilet-trained teenager gets into college in 2031? Likely your answer is yes. Let’s reorient ourselves and our institutions using four principles: play, forget, connect, and respect. To mitigate the toilet-training post and similar threats to privacy and opportunity, the digital world needs to return to the internet’s more playful, iterative roots. It needs to set up a protected place for childhood play in the same way we try to protect brick-and-mortar playgrounds and classrooms: experimental, iterative, inclusive, and equitable.

To do this right, the digital world needs to be forgetful. It needs to let go of much of what it knows about our kids and teens in order for them to develop the autonomy and agency necessary for thriving youth and meaningful adulthood. And when youth are engaged in digital spaces and relationships, the people, businesses, and other entities with which they interact need to show them respect. If their data is going to be commodified, they are entitled to more agency as economic actors rather than objects.

Play: Making Room for Make-Believe, Mischief, and Mistakes

We may be the hypocrites our teenagers think we are. We use digital technologies to enhance our own lives without thinking enough about the impact that sweeping up youth into our tech use might have on them.13

This book is focused on the impact of our sharenting on our children. But briefly: many other dimensions of our relationships with our children are likely affected by our digital tech use as well. We need to ask whether our bonds with our digital devices are interfering with the parent-child bonding necessary for healthy child development.

This isn’t a sharenting question. For this question, it doesn’t matter whether we are using our phones to take pictures of our kids (and then posting them online) or to pay parking tickets. It matters only that we are using a phone or other digital device. Academic inquiry into the developmental impact of adult digital tech use on parent-child and other adult-child relationships is still in its early stages. Notably, a study published in the journal of the American Academy of Pediatrics cautions that despite some positive potential from digital tech, “mobile devices can also distract parents from face-to-face interactions with their children, which are crucial for cognitive, language, and emotional development.”14 Presumably, a study is forthcoming that concludes it is better to direct the inevitable frustrations associated with parenting into a snarky text thread with your friends rather than be snarky toward your children.

Back to the hypocrisy allegations: we use tech to create opportunities for ourselves and to control young people’s digital and nondigital lives. For example, we encourage our kids and teens to use ed tech in their classes and activities in order to put down roots in STEM. Yet many schools have “zero tech tolerance” policies for students’ use of their own devices on school grounds, and a student caught texting too many times can wind up with an out-of-school suspension.15 We value “entrepreneurship.” We applaud start-ups that “think big” and “fail early and often.” Yet we are often intolerant of that same iterative process in childhood, even though it’s necessary developmentally.

We’re letting the grown-ups play and making the kids pay.16 We have it backward. The digital world needs a protected place for childhood to play in the same way we try to protect brick-and-mortar playgrounds and classrooms—by making them experimental, iterative, inclusive, and equitable. We can’t have play be the province only of those privileged enough to build private forests for their kids.17

So how do we head in the right direction? The recent history of our current frontier, the digital one, offers some inspiration. In its origins, the internet had a Wild West ethos that allowed individual participants, including kids, a lot of room to play.18 That playful spirit is alive and well, but the terrain has shifted. Increasingly, the Wild West ethos seems to be manifesting itself in a gold rush for data.19 Institutions in the private and public sectors have a “grab data now, figure out what to do with it later” mentality. That adventuresome spirit certainly brings some societal benefits with it. Technological, entrepreneurial, and other forms of innovation are powerful drivers for economic, educational, and other key areas of growth—but not if they come in the form of individual or institutional decisions that keep the gold for the grown-ups and give kids lumps of coal.

We need to let kids have their own Wild West. Protecting the frontier of childhood requires that adults make conscious choices to impose limits. We need limits around those types of childhood experiences we track digitally.20 We need limits around how we share and use the data that we do collect.21 Playing requires room to experiment within reasonable bounds, without too much attention from or accountability to others. There is a tension here: without bounds, play becomes more Lord of the Flies, less Peter Pan. But with too many bounds, play loses its essence of exploring, making mistakes, learning from mistakes, and then doing it all over again.

The focus of this analysis is on experiences that adults share or that adults set up children to share, like giving a toddler a smart toy. But this same principle of play would certainly apply to situations where older kids and teens are making their own choices about which data to share and why. Here’s a concrete example: a parent chooses not to use a toilet-training app with unclear privacy policies. We also need to limit the potential negative consequences from those experiences we do track. So the potty programmer flushes its old privacy policies and replaces them with a guarantee not to reshare toilet-training data with any third parties, including data aggregators.

There are plenty of other examples. A summer theater camp decides not to take digital pictures of campers until opening night to avoid making the teen performers nervous as they rehearse. When the curtain rises and the flashes go off, the camp puts pictures on its website only after the teens themselves and their parents give consent to publication of specific pictures. For its part, the admissions office at the local community college agrees not to Google applicants or, if it does, to notify applicants of any results that surface that give the office pause and give the affected applicants a chance to explain the content. That way, if a picture of Lady Macbeth from that summer camp production, dressed in black and holding a bloody knife, is made into a meme, it won’t raise concerns about the applicant’s mental stability. She’ll explain: “This is not a sorry sight. It’s just a spotlight on my future stardom!”

Social media and other tech platforms frequented by parents, teachers, and other adults could develop more features to encourage choices that protect play, such as a feature that asks, “Are you sure you want to post this about your child because it could have the following consequences?” We could also look to companies for parent versions of kid-focused platforms. We have YouTube Kids and new Google services for kids.22 But what about YouTube Parents or Facebook Parents: ways for parents to connect without these platforms tracking, aggregating, or otherwise using data about the parents’ kids?23 For example, Facebook could leave up a post from a parent about toilet training with the privacy settings that the parent picked but couldn’t pass that information through to third parties in any way or use information from it for its own internal market analysis or product development.

These are just a few examples of human-centered and tech-centered solutions to protect play,24 but implementation of the same principle could be pursued using laws or regulations. A legal toolkit seems best suited to regulating governmental, commercial, or other institutional conduct rather than monitoring parental conduct directly.25

For instance, the federal Department of Education could issue rules requiring colleges and universities that take any funds from the Department of Education to adopt some version of the “don’t Google or explain the Google results transparently” privacy-protecting approach outlined above. Government agencies outside of the education realm could also limit the uses to which they put private data about kids. An agency in charge of public benefits might commit to use family data to make public benefits decisions but not to engage in predictive analytics around any sensitive (yet arguably relevant to public benefits administration) life events, such as trying to determine which kids in a family might become teenage parents themselves.

We could think bigger than the rules or policies promulgated by individual federal agencies. We could have a federal law that prohibits all federal agencies, as well as state and local agencies that receive federal funds, from making sensitive predictions or decisions—around education, public benefits, and job training programs, to name a few—based on all or certain types of digital data collected about kids and teens, whether directly from them or from adults about them. For example, we could allow the use of data to determine the grade in which a child should be enrolled but not allow it to be used for third-party data analytic software that attempts to determine which children will be truant from school.

This type of federal law could also be written to apply to private companies engaged in interstate commerce. It could be broadly written or more sector-specific. Here is a valuable example: in 2014, California passed a law limiting what ed tech companies can do with the digital data they collect about students. Under the Student Online Personal Information Protection Act (SOPIPA),26 these restrictions on ed tech companies include a prohibition on using student data for “targeted advertising” or the “creation of a profile of data on an individual student unless the profile is ‘amassed’ for ‘K–12 school purposes.’”27 This state law aims to fill gaps in the existing federal legal framework for student data privacy by regulating companies that collect digital data directly (rather than looking to parents and schools to be gatekeepers for youth data privacy) and specifically delineating activities prohibited by companies that the legislature found to be threatening to kids and teens.

Congress could follow California’s lead for ed tech companies nationwide and prohibit marketing, profiling, or other high-stakes activities based on student data. Congress could go beyond ed tech companies and require the same of all companies that know or have reason to know that they collect digital data about youth—whether from youth directly or from parents, teachers, and other adults on behalf of youth. Congress could also look to regulate earlier in the digital data acquisition cycle and intervene at the stage of data collection, rather than waiting until the stage of data use.

As effective as it could be, the prospect of broad federal legal reform around youth digital data privacy seems as far away as the stars that guided Peter and the Darling children. In recent years, even more limited efforts at federal law reform around youth digital data privacy (for student privacy specifically) have failed.28 And given the tech sector’s focus on digital data as a profit source, the many other sectors (both public and private) that rely on digital data for various purposes, and our collective reliance on digital devices, comprehensive federal statutory or regulatory reform around youth digital data privacy will likely continue to be light years away.

At the moment, perhaps that’s just as well. Federal legislation and regulation can be cumbersome and overbroad, running the risk of stifling innovation by state legislatures, regulators, and other actors in the public and private sectors. Let’s play around ourselves. What would it look like to think about play as a positive motivator for legal reform, whether at the federal or state level?

So far, the suggestions we have explored around potential legal reform are protectionist. Privacy is used to protect childhood and adolescence as a space for play. Through creating virtual boundaries around these life stages—in terms of what can be done with information about the humans in those stages—we are enabling experimentation, the process of making and learning from mistakes. But protection is, as lawyers like to say, a floor, not a ceiling. Let’s blow the roof off the ceiling. How could law be used to promote childhood and adolescence as life stages grounded in play in our digital world?

We could spend money, offering public funding to companies or other entities that create “digital content and services of social, civic, artistic, cultural, educational and recreational benefit to all children.”29 This language comes from recent guidelines issued by the Council of Europe’s Committee of Ministers on how its member states can “respect, protect and fulfill the rights of the child in the digital environment.”30 These guidelines recognize a “right to engage in play.”31 Although these guidelines are not binding on the United States, we could look to them—and we could one-up them.

We could spend public money to incentivize the creation of these types of digital content and services that include both youth and their parents, their teachers, and other trusted adults. We could think about a massive investment in building the digital equivalent of neighborhood playgrounds or national parks. What would the new city-level Digital Parks Advisory Committees design?32 Could they build hybrid outdoor digital spaces where kids and adults alike could play in actual and virtual sandboxes? What would a future federal Digital Wilderness Act give us?33 Could a digital space be created that is secured for the online equivalent of hiking through pristine mountains: no advertising, no surveillance, no trace left of the experience? We have no idea. Yet. That’s the beauty of it: we can start to build it, and the playing will come.

In part, it will come because we will play in the process. Just as our kids do, we will need some room to experiment in the digital realm. These suggestions illustrate some ways the legal and regulatory toolkit could be used to foster youth digital lives that are protective of play and the benefits that flow from exploration, experience, and learning from mistakes. They are not the only, the best, or even necessary ways. We will doubtless have some false starts as we try to foster play in the digital realm. We will try again. We will have partners. New questions about digital life are giving rise to new collaborations on new projects. Many of these collaborative ventures represent what Urs Gasser, a leading scholar of digital governance, terms “multistakeholder” digital governance:34 different sectors joining forces to tackle complex, unprecedented, and rapidly evolving challenges. In a sense, there is an element of play—in a pure, noncommercial sense of the term—to this approach. Let’s dump our toys out together and see if we can make this pig fly.

It’s not all fun and games. Each sector has its own values, goals, and other structures. Sometimes, there is conflict between or within sectors. Sometimes, in order to achieve a certain principled outcome, one sector needs to have the final say. It’s important for the market sector to contribute to the fight against child pornography. It’s imperative that the government have the power to prosecute and incarcerate child predators. It’s crucial for leaders and employees in the market sector to pledge not to create a “Muslim registry.”35 It’s essential that the court system remain open to address the violations of core civil rights and liberties that would arise from a government attempt to contract out such a registry. It’s beneficial when the government makes grants to get new types of tech businesses up and running.36 It’s bedrock that the market sector produces based on private innovation and investment, within legal limits. As in many play spaces, sometimes one side is stronger than the other. Red rover, red rover, send the government right over. Psych! I want to play with the tech sector instead! Just kidding. No take-backs.

What will the team of grown-ups look like that helps put the play back in the internet? The eagles and the drones are available. Countless other combinations of players are as well. As in the eagle and drone saga, it’s too soon to tell who will win this particular race. In part, the crystal ball is murky because it’s unclear what winning would look like. Would it mean that kids and teens are taught how to play so that they can monetize their creations? Would it mean that they are taught how to play without concern about entrepreneurship? Somewhere in between? There are normative value judgments at work in all proposed toolkits. Sometimes, these are explicit. Other times, they are implicit. When assessing attempted solutions, it’s important to think critically about what those values are and whether you agree with them.

Many stakeholders are rushing to innovate at this intersection of the internet and play. Much like the eagle-drone match-up, some of these innovations arise at the intersection of the digital and the traditional. For example, traditional brick-and-mortar toy companies are starting to think more outside of the proverbial shrink-wrapped box. In summer 2017, Mattel debuted a new figure: a chief technical officer (CTO). The good news: the CTO doesn’t need batteries. Now for the bad: you can’t find him in any store or app near you. The CTO lives only in Mattel’s headquarters, where he is in charge of creating “digital physical” experiences for children.37 At the moment, the CTO is rare enough to be a collector’s edition. He has few counterparts in corporate toy headquarters across the country.38 He’s unlikely to be lonely for long, though. Making and selling toys aren’t child’s play. The toy and game industry is a huge market and creative force. Toy manufacturers are realizing that they need to transcend the online-offline divide to engage children where the children dwell. A CTO can help lead the way to Interland.

Wait, where’s Interland? Why hasn’t Siri given you directions yet? Siri, you @#$# piece of !@#$@#.

Stop screaming. Everyone in the world will be able to hear you when your meltdown gets filmed and posted on YouTube.

You can find Interland courtesy of another recent adult addition to the collective project of reimagining the internet as a place for more and better child’s play: a new “Be Internet Awesome” curriculum from Google and the nonprofit iKeepSafe (iKS).39 It contains a lot of lessons that take place in Interland, a designation that evokes Neverland. The curriculum seems to present this realm as simultaneously real and make-believe. The real part is that kids can get to Interland anytime, anyplace, anywhere. It’s a real destination: just a few clicks, swipes, or, if you’re taking the sensor or wearable route, steps away. The pretend part is that you go there to play games, tell stories about your and your friends’ lives, and generally get away from the “here” of everyday life.

It’s a fair portrayal. The digital world is a liminal space. But is it a space that should displace or even replace Neverland? Not the Disney version but that frontier of unfettered childhood play for which Neverland is a general signifier. Digital Neverland should respect the original. It should integrate principles of the original. It should not destroy or co-opt the original.

The Google-iKS curriculum is a massive undertaking designed to help teach kids how to develop safe and rewarding online lives. This is a map that needs to be drawn, with a destination that is broadly accepted as desirable. However, this and all similar educational ventures should empower kids to do more than navigate the digital world as they find it. Our children need to feel they can change the digital world: often, “young people themselves are the ones who are best positioned to solve the problems that arise from their digital lives.”40 Finding Interland is not enough. We need to make sure that kids can build a new world and that we can too.

At its core, play pushes boundaries. It creates new worlds. Sometimes these are internal. Other times, they are interpersonal, institutional, or virtual. A map can get you started. It can’t get you all the way there. Play on!41

Forget: Creating Clean Slates and Room for Reinvention

What happens when play fails? What happens when we’re not playing and we do something thoughtless, cruel, or embarrassing? Can we just sweep the mistakes into an old toy bin and forget about them?

In the early 2000s, New York City’s Metropolitan Transportation Authority put up posters in its subways that said more or less, “Sometimes you have to go backward to go forward.” This public service campaign was intended to make riders feel less grumpy about subway repairs, but the same basic concept applies to growing up. You have to go backward to go forward. You have to make mistakes in order to grow. You have to remember your past in order to leave it behind.

Kids and teens today do not have the good fortune of actual ignorance or benevolent amnesia on the part of people and institutions they encounter as adults. There is too much remembering and not enough forgetting.42 Today’s youth are the digital superstars of their own existence, from womb to dorm and beyond. They leave a trail, much of it not generated by them. Even as recently as the 1990s, the only computer-based trail young people were likely to have associated with them was their time playing the video game Oregon Trail. For Generation X and even some of Generation Y, the identity of your high school boyfriend might be something that only a few people in your adult life know. Maybe that data detail is enshrined in hard copy in your senior yearbook gathering dust in your parents’ attic. But only the moths know.43

To kick it old-school media style for a moment, think about Say Anything, a classic teen movie from the 1980s.44 When John Cusack’s character, Lloyd Dobler, holds up his boom box to blast a Peter Gabriel song to his beloved, no one live streams a video. Today, that iconic exchange would live on long after the first love faded. And Lloyd probably would just AirDrop the song anyway. Or maybe Diane’s father, who hates Lloyd, would film the whole thing, put the video on YouTube’s “Jerks who want to date my daughter” channel, and then turn the footage over to the police for a trespassing prosecution. And then either that video would be played by the best man at their wedding in the “happily ever after” version, or in the version where they break up after high school, it would be played furtively by each of them.

The digital world remembers. It’s set up to remember. But it needs to learn how to forget the silly, stupid, and insensitive things kids and teens do, especially when adults are choosing whether to record them. We need to think more about how the digital realm could forget childhood. Right now, we’re thinking more about how to make sure all remembering is done consensually and safely. So we’re thinking about up-front consent by parents or protection from unauthorized third-party access. We do think about data destruction, but more as a security measure than as a principle of limited use.45

The European Union (EU) is doing more than thinking about forgetting. It is creating new, robust, and still evolving legal protections for individuals that empower them to require certain holders and users of their digital data to forget about them—if certain criteria are met. Under the new General Data Protection Regulation (GDPR), which went into effect on May 25, 2018, EU “subjects”46 have a legal “right to erasure,” more commonly called the “right to be forgotten.”47 Erasure is like taking one of those old-fashioned pink rubber erasers from school supplies circa 1990 to the digital data that a given data controller has about a person: under certain circumstances, following a request from that person, the controller has an “obligation to erase personal data without undue delay.”48

Among the many reasons this right can be invoked are that the “data subject has given his or her consent [to personal data use] as a child and is not fully aware of the risks involved by the processing, and later wants to remove such personal data, especially on the internet.”49 However, this basis for erasure does not appear to apply when that child’s parents shared the data themselves, or even when those parents gave consent for the child to share the data directly: “The GDPR has been partially inspired by COPPA [the US Children’s Online Privacy Protection Act],” which is a parental consent-based framework.50

For children under age sixteen (or under age thirteen, in EU member states that have adopted that lower threshold), parental consent is now required in most circumstances before digital data can be collected from children themselves. It is unclear how much erasing young adults in the EU can do of data that their parents gave consent for them to share. And it does not appear that this right reaches data that was sharented about them in their youth, because it was not their consent that was required to share it in the first place. It was their parents’ decision rather than theirs.

Even if young adults in the EU are able to force the forgetting of previously sharented information, young adults in the United States have no such legal right. In the United States, we don’t have any type of federal legal right to be forgotten for kids or anyone else.51 The law—on the state level—mandates such complete forgetfulness only when the juvenile justice system is involved. Those records are confidential, often remain sealed, and can’t be ported over into adult life in many situations.52 The rehabilitation rationale in the juvenile justice realm requires both real-time and future protection from disclosure. This protection allows rehabilitation to take place and allows the adult that the child has fathered to enter adulthood unencumbered by any stigma or other life-altering consequences of childhood actions.

The penalties for disclosure of juvenile justice records can be severe. It may be a criminal offense for a party to a juvenile justice case to share records from that case outside of the other parties.53 A post from Aunt Polly that reads, “Court today for Tommy. Tommy testified that Huck made him do it, but Judge still slapped him w/ two years in juvie for faking his own death. Here’s a copy of the court order #injustice” would be criminal conduct. But a post that reads, “Tommy faked his own death today. Damn Huck. Grateful for his return but want to wring his neck!!” would be fine. We find ourselves in a strange situation where juvenile misbehavior that leads to involvement with the juvenile justice system triggers the strongest possible privacy protection. In contrast, misbehavior that stays out of the system is entitled to little or no privacy protection.

The takeaway isn’t to get rid of the privacy protections for the juvenile justice system or to place even more situations into this system. We need to import the animating insight from this system (that learning from mistakes requires a clean slate going forward) to our treatment of childhood experiences more broadly. We need to explore whether and how children and teens deserve a “right to be forgotten” or “reputation bankruptcy”54 with respect to those pieces of private data that parents, teachers, and other adult decision makers capture, create, and use about them without their knowledge or consent.

There would need to be plenty of carve-outs for appropriate medical, educational, governmental, and other uses. Such a right wouldn’t require a doctor’s office to dump a childhood vaccination record. But it might require Instagram to remove a picture a parent posts of a child’s face covered in a rash, if the child reaches age eighteen and goes through an appropriate process with the social media company that hosts the picture. Indeed, social media companies already provide a way for parents to request the removal of or impose limits around certain information about their children.55 It gets trickier, however, when we start to think about an eighteenth birthday as bringing with it a legal right to have content previously shared by parents, teachers, and other adult caregivers removed.

The First Amendment looms large here. What about the scenario where a child breaks from an evangelical Christian family in adulthood and requests that all family photos involving the child be removed from social media because the photos demonstrate the prior religious affiliation of the child? For the law to require a social media company to remove that post or to require parents themselves to take down the posts raises serious free speech and free exercise problems that the parents may well assert,56 even if the company provided prior notice about granting the removal right for young adults when they come of age.

The parents have constitutionally protected rights to their freedom of conscience and speech. These rights include the freedom to practice and talk about their religion. The Fourteenth Amendment joins with the First to supercharge those rights when it comes to childrearing.57 As discussed above, parents enjoy heightened protections of domestic privacy to guide their children’s religious upbringing.

However, parents can’t use religious freedom as a shield to avoid compliance with criminal law or most other laws of general applicability that protect public welfare and safety. But they can use it as a sword to pierce almost all barriers to private worship and public dialogue about this devotion. Taking their kids to an evangelical church is private worship. Taking a picture and putting it online transforms this private moment into public engagement about religious belief. It’s a lot like giving your kids a bunch of flyers to sell on street corners to spread your family’s understanding of the gospel. Such an arrangement is so last century and likely also illegal.

Why isn’t it a #1stamendment #foul for the government to make parents keep their kids off the street but not off the information superhighway? Is it reverse digital discrimination? At first glance, it might appear that the odds of a First Amendment victory rest on whether the speech is happening online or off. But look beneath the surface. Whether the road is pavement or fiber optic is irrelevant. What matters is whether the activity that is happening on the literal or digital sidewalks is legally defined as one that can be regulated under public welfare and safety laws.

Having kids stand on street corners to sell religious literature? That’s child labor, according to the Supreme Court, so state laws that regulate children’s employment to protect them from unsound and unsafe practices can be enforced. Enforcement may occur even when the labor implicates the free exercise of religion and other constitutional rights. When the aunt of a nine-year-old girl let her ward offer religious material for sale on the street corners of Massachusetts, in contravention of the applicable state child labor law, the US Supreme Court held up a stop sign: “Neither rights of religion nor rights of parenthood are beyond limitation. Acting to guard the general interest in youth’s well being, the state as parens patriae may restrict the parent’s control by . . . regulating or prohibiting the child’s labor, and in many other ways.”58 The aunt’s criminal conviction stood.

What about when kids are pictured online? Let’s return to our hypothetical evangelical family. Let’s say that the posted picture of the kids in their Sunday best showed them holding open copies of the New Testament, with this caption: “Know the truth.” The posting of that picture isn’t considered to be child labor under existing federal and state child labor laws. Posting the picture does not result in kids’ offering any items for sale or performing any service for compensation. There are situations in which a picture posted of a child online could implicate child labor laws. For example, when a child participates in a photo shoot for a fashion designer and the resulting pictures are put up online, child labor laws apply to regulate the duration and circumstances of the photo shoot itself. But in this and similar instances, child labor law is concerned with the circumstances of the acts captured in the picture, not the posting of the picture. Even though kids’ private data is being used by parents and other trusted adults to obtain free or low-cost digital services, the law does not tend to see transactions of this data as a form of child labor.

Forget the law. (Isn’t forgetting fun?) Let’s think instead about what social media and other digital tech companies might offer in their privacy policies and terms of use. As a matter of private contract between the company and the user, why couldn’t the companies reserve the right to remove any data an adult user transmits about a minor after that child turns eighteen, subject to the child-turned-adult’s request? Such provisions would put the adult user on notice of this reasonable limitation on their rights to control private data about children. The adult user could choose to use or not use the service with this understanding in mind.

You might hear Silicon Valley screaming: “This would take too much work!” (Insert angry emoji faces here.) Yes, it would take a lot of work. However, there would be plenty of ways to make it less work than it might initially seem. The companies would need to put a process in place for the requesting party to prove that she is the child whose data was shared. The companies then could limit the types of data subject to removal to content that might harm privacy or cause embarrassment to a reasonable person. The companies could require that the requesting party make a credible showing of why the data requested for removal falls into one of those categories. The companies could put some categories of data completely off limits for removal, such as data more than a decade old or data that has been irrevocably mixed in with the adult user’s data such that it would not be possible to disaggregate. Data picked up from smart home appliances, for instance, might well fall into that second category.

It seems unlikely that this type of private-market solution would be offered by companies. In addition to the legitimate concerns about workload, which likely could be mitigated through thoughtful tailoring, there is a more fundamental problem: where is the market cohort that would push for this new privacy or terms-of-use provision?

The companies could come up with some type of market-based partial solution to the problem of digital forgetting. But we adults are unlikely to want it. Even if we recognize that we may need to rethink our tech habits, we’re unlikely to push tech companies to give our future adult children a contractual right to revise our tech decisions. Kids and teens might think it’s a great idea, but they are still not the market force their parents are. And a legal framework that requires this type of child-turned-adult revisionist history risks running afoul of constitutional requirements and spirit, as discussed above.

It’s probably best, then, to forget about a childhood “right to be forgotten.” Perhaps the more appropriate right to consider is the “right to respond.” As long as data brokers and other third parties continue to aggregate and use private information, federal law or regulation could institute one or more centralized bureaus for the oversight and handling of this information. As part of such a comprehensive data broker, social credit, and reputation bureau scheme,59 teens could be entitled to a credit report–type disclosure of the digital data about them that is out there when they turn eighteen.60 The bureau could have in place a mechanism for youth to respond or provide a counternarrative to the picture of their youth that is available in the digital space and to request corrections of specific information that the brokers and bureau themselves are storing.

Even when we paint with purposefully broad brush strokes, many tensions and issues come into focus with such a scheme. Chief among them is that such an approach would add yet another layer of data aggregation and surveillance. Existing credit bureaus operate with a business model that could be described as letting the fox mind the hen house. Expanding that business model beyond financial data to other personal digital data would be like letting that fox open a restaurant and dish out eggs Florentine.

The following colloquy between Senator John Kennedy (Republican of Louisiana) and the former CEO of Equifax at a federal legislative committee hearing in fall 2017 lays out the heart of the credit bureau model with the precision of a four-star chef.61

Kennedy: You collect my [financial] information, without my permission. You take it, along with everyone else’s information, and you sell that information to businesses. Is that basically correct?

Former Equifax CEO: That’s largely correct.

Kennedy and his colleagues in the Senate went on to discuss the tensions inherent in this model of data collection and use. Credit bureaus, which are private businesses, collect sensitive financial data, like how much debt you have and how well you’re paying it back. They don’t ask your permission. They aggregate and analyze that data. They sell the data to other private businesses. Those businesses use that data to make monumental decisions about your access to essential opportunities, like getting a mortgage. Sometimes, those decisions go beyond consumer purchase transactions, like whether or not you’re qualified for a certain job.

And when the credit bureaus fail to safeguard your data, as in the massive Equifax data breach that Senator Kennedy was inquiring about, they offer you data monitoring and related services. These services may be free, but the chickens will come home to roost. For the credit bureaus, doing a crappy job may be the goose that laid the golden egg. They acquire a captive market of new customers who wind up paying them extra to do what they were supposed to be doing in the first place: serving as responsible and effective data stewards.

This business model is rather like an unscrupulous tow truck company that is hired by businesses to remove illegally parked cars from their lots. The car owners haven’t asked for their cars to be moved. And it’s not clear that removing these cars is the best way to free up access to these businesses or achieve any other commerce-related goals. But let’s stipulate that removing these cars is authorized by law and is essential to the free flow of commerce.

The towing company scoops up the cars but leaves them in the middle of the highway, where the cars get hit by other cars. The towing company tells the owners that it will tow their cars to safety free of charge. It won’t fix the cars because it doesn’t know how. (It’s a towing company, not Car Talk.62 Car Talk knows everything about cars and about talk.) But the towing company will get the cars off the highway, preventing further destruction. The company takes those cars to a parking lot in the middle of nowhere.

The cars have crossed to safety, but as novelist Wallace Stegner has told us, the story isn’t over so quickly.63 If the owners want to regain full use of their cars and ensure their cars won’t be whisked away again, they need to pay the towing company for additional services. Indefinitely. That’s right: the company that you never asked to work with in the first place and that has led to the wreckage of your car now has the chutzpah to ask you to pay it to do more work with your car. You don’t trust this company, but you have few or no other options for getting the work done. And the work needs to be done. Your car might keep running without the repairs, but disaster could strike at any moment. You’d rather not go from sixty to zero the hard way.

The car in this scenario is your credit history and all the personal information it contains—Social Security number, date of birth, addresses, debts. The towing company is the credit bureau, collecting this and other information without your explicit consent and often without your knowledge.64

A free-market capitalist system does need mechanisms that facilitate fair, efficient, and productive matches between borrowers and lenders. Lenders find money trees hard to grow, and borrowers find them hard to climb. For lenders, having a standardized screening for borrowers’ creditworthiness helps them run their businesses. For individual borrowers, having this system helps them access loan options quickly. With the digital data system of the credit report and credit score, it seems like everyone has struck pay dirt.

If the hen house of consumer lending needs to be guarded, why are the Senators on the hunt for the credit bureaus? Because the bureaus are doing a rather lousy job, and the current legal and regulatory landscape offers few incentives for improvement. Credit bureaus lack transparency and accountability. They mess up a lot. They take in inaccurate information. They fail to fix it. Consumers need to request copies of their credit reports, review the information, then make a written request for any changes.65 That’s right: an industry designed to be a data broker isn’t worried about how broken its own processes and quality control are.

The bureaus know where we live. They know where we used to live. They probably can predict with a reasonable degree of certainty where we’ll be living ten years from now. So why can’t they drop us a note and let us know when something is inaccurate? And use our credit card information to book the movers, while they’re at it? If that’s too much trouble, could they at least let us know when someone steals our credit card information to book movers, move books, or pay bookies?

Well, they could. Sometimes, they actually do. Often, they make us pay them for the privilege of telling us when this theft has occurred. Take the Equifax data breach, which is believed to have resulted in the theft of private consumer data from over half of adults in the United States. Equifax’s failure to take straightforward data security measures made the theft possible.66 As soon as thieves have this information, it can be used for a variety of purposes that cause concrete harm to us, the rightful owners of the information, such as obtaining new credit or filing a fraudulent income tax return. Harm can also manifest in less tangible forms, such as the stress that comes from anticipating the possibility of future harm.67

Equifax told us all about the breach free of charge. Then it offered us monitoring and notification services to let us know if our stolen information was used for unauthorized activities. It also offered us the ability to lock our information so that it couldn’t be used at all. These offers initially were made on a “freemium” basis: free of charge for a period of time, after which payment was required.68 Even though Equifax backed down on some of its proposed charges, it still stands to benefit financially from the mess it created.69 Last one to sign up is a rotten egg.

Back to our hypothetical social credit and reputation bureau scheme. Let’s say it’s possible to establish a bureau that works effectively and fairly. Even with effective and fair functioning, such a bureau still would require a high level of digital literacy for young adults to navigate. There are other potential interventions, in addition to or instead of this one, that could promote forgetting. For instance, social media companies could offer an “auto-forget” option that applies to all content in an adult’s feed that is demonstrably about kids. This auto-forget option would have a few advantages over current privacy settings, chief among them that you could “set it and forget it” rather than having to go back through and manually remove or change the privacy settings for certain content as your kids get older.70 And it wouldn’t suffer from the problems of the childhood “right to be forgotten” scenario, where kids have removal rights only after they come of age.

Here, adults could make a decision up front that lets them enjoy the benefits of social media or similar digital technologies at one point in time while letting their engagement with these technologies evolve as their children grow. App developers could also get in on the action. There may be an emerging market for an app that is interoperable with major social media, retail, and other heavily trafficked sites and that would opt posts about kids out of any data aggregation and set an auto-forget function.

There could also be legal limitations on the specific uses of childhood data over time. For instance, just as negative information on a credit report drops off after seven years, a schema for a data broker bureau could include a mandatory erasure of certain private and sensitive data about a juvenile after the juvenile turns twenty-five, which is seven years after the legal age of majority. Tech, legal, and other types of devices could achieve similar or complementary results. Here’s the key direction: if the child is going to father the man, after a certain point, the child needs to fade away for the adult to enter stage left. Sometimes, you have to stop looking backward to go forward.

