It is the best of times. It is the worst of times. And determining our current, personal outlook often depends, from moment to moment, on our relationships with technology. Does our smart car recognize our hands-free key, or are we locked out and exasperated? Did Alexa play the hip-hop hit we asked for, or did “she” mistakenly order a hundred dollhouses to be delivered?
Our relationship with technology is pervasive and powerful and goes well beyond the short-term gratification or frustration of a GPS interaction with Siri. Technology does things like restructure the job market, dictate how we interact socially, influence how we learn and encourage us to buy more stuff.
We love our computers and smartphones, but let’s be honest, we also are wary — even fearful at times — wondering how much “they” know about us. And is our data being used in ways that are fair?
Current research by University of St. Thomas computer scientist Dr. Carlos Monroy explores fairness and bias in algorithms and analytics systems. The data analytics professor counsels that better understanding of technology’s potential dangers is an essential part of learning how to limit those dangers and, therefore, enjoy technology’s vast benefits without fear.
How “They” Learn About Us
Let’s begin with how they do it. Data gatherers like Google and Amazon write computer programs called algorithms, which pay attention to what we read and watch. Algorithms feed artificial intelligence (AI) systems that learn what we are like. Every time we click on something, we submit information about ourselves, information that companies are willing to pay for.
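The mechanics behind this kind of profiling can be surprisingly simple. A minimal sketch, with invented category names and click data, of how a recommender might tally a user’s clicks and surface their strongest interest:

```python
from collections import Counter

# Hypothetical sketch: building a preference profile from a click
# history. The categories and clicks below are invented.
clicks = ["sports", "sports", "news", "sports", "cars"]

profile = Counter(clicks)                    # tally what the user engages with
top_interest, count = profile.most_common(1)[0]

print(top_interest)  # "sports" -- the category most likely shown next
```

Real systems weigh thousands of signals, not five clicks, but the principle is the same: engagement is counted, and the count shapes what appears next.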
“Here’s an example for you,” Dr. Monroy said. “I grew up in Guatemala. In my Houston home, we have a smart TV. We can be watching an NFL game in English, and it shows me commercial advertisements in Spanish. How does the TV know to do that? They know because AI assumptions are made from my online activity.”
Similarly, when Monroy reads news online, he tends to be presented with the same types of stories he has clicked on before or pop-up ads for a car similar to one he looked at previously.
TIP: Disinterest also is noticed by the data gatherers. If we want to stop seeing certain types of aggravating posts on Facebook, all we need to do is stop clicking to expand and read them.
What They Know About Us
Just from following digital trails, Big Data companies know how old you are, where you went on vacation, whether you’re in a relationship, anything you searched for online, what sorts of movies you prefer, where you live, what you ask of the virtual assistant Cortana, what you like or don’t like on Facebook, how smart you are — and that’s just for starters.
The information gathered about us becomes part of enormous data sets referred to as Big Data. They are used to spot patterns and trends that relate to human behavior.
One important example of how such information is used is mass transit planning. Public transit planners benefit from seeing where the population is going. Where are the potential riders who really need new bus or rail service to get to their jobs? “Smart schedules” generated from the data even predict how many buses or trains are warranted on which days and at which times.
“With those patterns and trends, Big Data decides fates,” Dr. Monroy said. “For instance, it knows our credit score and very efficiently determines whether we can get a mortgage loan or not — and what our interest rate should be. It decides who gets into some colleges, who gets a particular job, and who gets out of prison on bail.”
When Tech Pushes Our Buttons
In spite of all the ways that technology makes our professional and personal lives easier and more efficient, we acknowledge a gnawing uneasiness about our increasingly digital dependence. We hesitate to open the email from an unfamiliar sender. We mask the camera lenses on our computers.
Dr. Monroy believes that some of the discomfort comes from a frequent requirement that we interact with machines instead of humans.
“One of my former students went to interview for a job, thinking he would be sitting opposite a hiring representative and answering questions,” Dr. Monroy said. “He was disappointed when he was directed to a computer. He sat and keyed in his responses to on-screen questions. The experience was awkward and made him less interested in working there.”
What is even more common, and disconcerting, these days is for online job applicants to hit the green submit button only to have their applications disappear into some electronic black hole. Why? In some cases, the answer could be discrimination, plain and simple.
It happens. In 2015, Amazon discovered that its experimental online recruitment tool had been discriminating against women, and the company ultimately scrapped it.
The fact is that people write algorithms, and people have conscious or unconscious biases.
Fairness Dilemma and the Secret World of Algorithms
“As humans, we can assume that unconscious bias winds up being part of our algorithms and presents ethical questions,” Dr. Monroy said. “Are certain segments of the population being left behind because our algorithm is geared to more preferable segments?”
Dr. Monroy’s research examines the “what ifs” of algorithms. What if an algorithm contained a bias against people of a certain age, people of a certain race or gender, or people who have a history of getting mental health help?
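One simple way researchers probe such “what ifs” is to compare an algorithm’s approval rates across groups, a basic fairness measure often called demographic parity. A minimal sketch, using invented decision data and made-up group labels:

```python
# Hypothetical sketch: auditing a model's decisions for demographic
# parity. The groups and outcomes below are invented.
decisions = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

def approval_rate(group):
    """Fraction of applicants in `group` who were approved."""
    outcomes = [approved for g, approved in decisions if g == group]
    return sum(outcomes) / len(outcomes)

gap = approval_rate("group_a") - approval_rate("group_b")
print(round(gap, 2))  # 0.5 -- a gap this large flags possible bias
```

A nonzero gap does not prove discrimination on its own, but it tells auditors exactly where to look harder, which is the kind of transparency this research aims to enable.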
“It’s a question of ethics and fairness, and it’s the right question to be asked in research here at St. Thomas,” Dr. Monroy explained.
Research like this is tricky, because Dr. Monroy cannot simply see and dissect the algorithms of Big Data companies like Google or Facebook. Those computer formulas are proprietary, protected and kept secret for competitive reasons.
Considering the impact of these technologies on our lives, and mindful of Plato’s thoughts on fairness, the data analytics expert will add imagination to his process.
“We will look at how unconscious bias could show up, so that developers can be aware and design to avoid bias,” he said.
Dr. Monroy believes the knowledge gained from his research is important and will serve to ease public concerns about Big Data.
In the future, companies may be compelled to disclose what their algorithms seek, much like a breakfast cereal box tells us what ingredients are inside.
“They should,” Dr. Monroy said. “If a decision is made that impacts you, you should be able to see how the decision was made.”
Students Use Tech to Help Houston
When he is not doing research, this advocate of technology might be mentoring his computer science students as they develop and use computing technology to help the City of Houston address problems such as homelessness and traffic accidents.
“The students are thinking creatively about issues affecting Houston and putting computer muscle behind the economic, social and geographic information made available by the City,” Dr. Monroy explained. “This is UST’s contribution to the Station Houston consortium.”
Given the ubiquitous presence of computers and smartphones, it’s hard to see where humans end and technology begins. We buy our airline tickets online and text our loved ones about the purchase. We support our health by counting steps on a Fitbit. We drive cars with sensors that alert us when another car is too close. Technology is here to stay.