Turn Over Child-Raising to a Computer?
Brendan Dixon
When our kids were small, my wife and I used a baby monitor. It was quite basic: We could hear our kids (when we left the volume up) and watch a jerky jiggle of red lights as noises ebbed and flowed. But the monitor kept us engaged with our kids. We knew better than to answer their every squeak and cry, but were concerned (and what young, new parents are not?) to know if some real danger or problem arose. The monitor covered the distance between our actual and imagined fears.
Our baby monitor made us better, not poorer, parents. If the kids needed a nap, we could, without risk, lay them down and go on to other tasks. It freed us to garden, wash the cars, do the laundry, or whatever, and remain connected to our kids. In a sense, it extended our abilities. We remained the parents. We were still "in charge." The monitor was a tool to help us do better that which we already had to do, that which was ours to do.
That describes the proper use of technology: Good technology extends us so we can be better humans. Tools that ease work, enhance connections, augment vision, and so forth enlarge us. They enable us to do better that which we need to do and that which we ought to do. Technology, however, designed to squeeze humans out by replacing our unique skills makes us less than we are. Granted, we can use some tools either way, to extend or to reduce us. But raising children is uniquely human and best not replaced by machines.
So, while we owned and benefited from a baby monitor, Mattel's new "smart baby monitor," their digital nanny dubbed Aristotle, leaves me flummoxed. Mattel and their partner Microsoft introduced the "Echo for Kids" at this year's Consumer Electronics Show. (Let me be clear, before going further, that, while I work for Microsoft, these are my opinions. I in no way represent Microsoft, nor am I expressing corporate opinions on such matters.) To calm parents, Mattel and Microsoft assure us that they take security and privacy seriously, following and exceeding the relevant government standards (such as COPPA) with additional protections (since no parent wants their digital nanny to answer an innocent request with pornography).
Sidestepping Aristotle's "perky, 25-year-old kindergarten teacher" female voice, Microsoft and Mattel engineered the nanny to "read kids bedtime stories, help teens with their homework, and auto-soothe babies when they wake up wailing." Three different Artificial Intelligence engines power Aristotle. Together they can identify each child by voice, instruct them, and change behavior as the child grows from toddler to early teen. Parents can also configure Aristotle; for example, to withhold that bedtime story if the child fails to say "please." As one reviewer wryly comments, "never will you have to touch your child again."
Neil Postman noted that we cannot measure all things, or reduce them to numbers, without loss (what would it mean, for example, to say I am 31.6 percent less handsome than Bill Gates?). Neither can we reduce all things to automation without twisting the tasks into something else. Automation, even trendy, AI-driven automation, entails algorithms. Algorithms, including those guiding "unsupervised" Machine Learning, encode decisions and perspectives. Someone, somewhere, somehow told the machine that this thing matters and that thing does not. Information does not arise spontaneously from matter. Digitizing divides the flow and flux of the world into zeros and ones. Something will get missed. Something will get cut out. Something will be valued over another thing. Neutral software does not exist.
Raising children is not a task we can or should automate. Raising a child entails training them to live fully into what they are: a person with gifts and abilities. Parents are responsible for inculcating values in their children. Parents, through example and training, teach children how to move through and contribute to society. And, what's more, as every parent who regrets after-the-fact uttering an inappropriate phrase knows, children learn at least as much by observation as by instruction. My wife and I, more than once, wondered whether instruction was a waste of breath and whether our children only learned by watching.
What would a child learn from a digital, even if AI-empowered, nanny? Certainly not how to be fully human. Certainly not how to behave within society. Certainly not those values and traditions and choices that make a family unique. Handing off our children to these tools reduces, not expands, our humanity. We, and our children, end up as less than we should be.
I cannot predict the future. But, I suspect, Aristotle will eventually go the way of Sony's AI dog, Aibo. Machines can be stunning, helpful tools. But, even "Deep Learning" Artificially Intelligent machines with "convolutional neural networks" are pitiful replacements for human beings. Good technology amplifies the best of us, inhibits our faults, and promotes the flourishing of the planet. Technology that replaces humans, devalues our unique gifts, and spoils where we live is not technology we should pursue.