As a Unix geek-turned-iOS developer, Damian Esteban can trace his fascination with technology all the way back to the IBM PCjr. As Damian watched his father set up the family's first home computer, he was entranced by the way the whole machine seemed to come to life, and now Damian develops apps to bring life to new, innovative ideas.
Everybody is prone to bias in some regard. Much of it is largely innocuous; if you grew up in a certain city, for instance, you're more likely to have fond memories of it (or not; everyone is different in that regard). However, in our bid to create artificial intelligence (AI) that accurately reconstructs human thought patterns, we have inadvertently introduced the biases embedded in language into the mix.
AI is one of the biggest technologies currently being utilized, studied, and expanded upon. It’s flexible and useful, allowing us to automate many processes that would take a human a long time to complete. Some aspects of AI, such as machine learning, allow us to create predictive analytics that can find patterns that would take years to discover manually.
However, as machines learn, they’ve picked up some not-so-desirable traits along the way as well.
Language is all about patterns, and as it turns out, AIs are particularly good at picking up on and interpreting patterns. Recognizing language has always been a central goal for AI, and even humble voice assistants such as Alexa and Siri are built on the idea that the ability to communicate can humanize even something as inhuman as a metal and plastic cylinder.
With a vast amount of text and data available to AI learning language, there’s a lot of context for these machines to pick up on. Even without overt biases, the inherent prejudice in every language does not go unnoticed. Unfortunately, without a way to reliably pinpoint and recognize the development of bias, an AI will just keep doing what it does without regard for what humans might find offensive or immoral. The difference here is that humans are able to recognize their own biases and consciously work to counteract them, something that AIs are not currently capable of doing.
This behavior was studied by Aylin Caliskan of Princeton University, along with Joanna Bryson and Arvind Narayanan. The researchers used word association, feeding an AI words and getting it to associate them with other words it thought were similar. When trawling through the vast amounts of data, the AI made connections between words frequently used in conjunction with each other, such as “fly” with “bird”, for instance.
When names overwhelmingly used by African-Americans were input, the AI was much more likely to output words with a negative connotation. Similarly, when it came to associating genders with professions, it was much more likely to associate men with professions such as medicine and women with professions such as nursing.
One of the most interesting parts of the study was the way that the AI’s decisions reflected implicit instead of explicit bias. This implies that the very structure of language contains biases that we may not realize. Anthony Greenwald—the creator of the Implicit Association Test (IAT)—even commented on the findings, noting that the AI could possibly test for implicit biases in older works of writing, among other things.
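The kind of association test the researchers used boils down to a simple geometric idea: represent each word as a vector learned from text, then measure how close a word sits to one set of attribute words versus another using cosine similarity. Here is a minimal sketch in Python; the tiny three-dimensional vectors are made up purely for illustration (real studies use embeddings with hundreds of dimensions learned from large corpora):

```python
import math

# Hypothetical toy "embeddings" -- invented values for illustration only.
# Real word vectors are learned from large text corpora.
embeddings = {
    "fly":        [0.9, 0.1, 0.2],
    "bird":       [0.8, 0.2, 0.1],
    "pleasant":   [0.1, 0.9, 0.3],
    "unpleasant": [0.2, 0.1, 0.9],
}

def cosine(u, v):
    """Cosine similarity: 1.0 means the vectors point the same way."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def association(word, attr_a, attr_b):
    """Positive score: `word` sits closer to attr_a than to attr_b."""
    return (cosine(embeddings[word], embeddings[attr_a])
            - cosine(embeddings[word], embeddings[attr_b]))

# Words that co-occur often end up with similar vectors, so "fly" and
# "bird" score high; the association score shows which attribute set a
# word leans toward.
print(cosine(embeddings["fly"], embeddings["bird"]))
print(association("fly", "pleasant", "unpleasant"))
```

A biased corpus produces biased geometry: if certain names co-occur with negative words in the training text, their vectors drift toward the negative attribute set, and this score makes that drift measurable.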
Given the already significant negative impact that bias can have on our culture, it is important to tread carefully when developing future AI. New automation trends will result in AI having a greater impact on everyday life, and anybody working with these machines will need to come up with methods to counteract bias. Policing these systems is a complicated task, but a necessary one if we want to prevent our prejudices from continuing to shape our future.
I’ve written before about Human-Centered Design (HCD) and user experience design (UX), two interrelated concepts that help govern good application development, new technology practices, and just about any product or service that will ever come in contact with a human. Even if you’re not versed in these concepts, you are intrinsically familiar with them; after all, it can be easy to recognize when design is good or bad.
So, with these new design paradigms more important than ever, it's worth examining certain industries and the way they have implemented these practices to account for user behavior. Perhaps no industry exemplifies the importance of good UX better than the museum industry. After all, experience is the entire reason to visit a museum. Immersive displays and innovative ways of displaying information benefit websites, applications, and museums alike.
Part of the reason that these concepts have gained such traction in today’s environment is because people are often more interested in the lifestyle sold by a product or service than any other aspect. It’s why, for instance, Target often gets away with higher prices compared to Walmart; the extra price is for the perceived minor luxury of shopping there.
And for museums, competing with other attractions isn’t just about offering more information or something to do; it’s about giving visitors a story to tell. People have proven time and time again that they are willing to pay for experiences. Regardless of whether a museum charges or offers free admission, many have adapted to stay interesting to guests, even those coming for repeat visits.
In fact, one of the pioneers of what’s called “visitor experience” was Freeman Tilden of the U.S. National Park Service. Tilden believed in crafting meaningful spaces that account for the ways visitors interact with them. To him, eliciting emotion was the most important part of ensuring a memorable experience, and his six guiding principles have a lot in common with later UX standards.
For example, the Museum of Science and Industry in Chicago opened an exhibit called “YOU! The Experience” in 2008. Designed to add a layer of personalization to the often-trite topic of anatomy, the exhibit allows visitors to interact with its displays, synchronizing stations about bodily functions such as heartbeats to the bodies of visitors.
This is a central tenet of UX—whether an application or an exhibit, the experience you provide should account for common behaviors and even encourage them, making their end goals easier. For museums, this is often expressed through their organization of information. In many historical museums, layouts will often move visitors through a specific chronology of events, perhaps following an individual through history as a television show would follow a central protagonist. The “Star Wars and the Power of Costume” exhibit at the Denver Art Museum takes on a different kind of chronology, following not the release of the films but the gradual creative processes that made iconic characters a reality.
Whether art, history, or science, museums have the daunting task of engaging and entertaining visitors for the duration of their stay. Accomplishing this requires a firm understanding of the psychology and common behaviors of visitors, in the same way that good UX in technology hinges on discovering and taking advantage of usage patterns. The next time you visit a museum, consider the way it presents its exhibits to guests, and the ways that its information hierarchy could resemble the design for other technologies.
The role of the CEO is changing, a shift due in part to new technology.
CEOs have always been responsible for growth in the companies that they lead. For all of the stereotypes of penthouse suites and high-octane business meetings, the bottom line is that this class of leaders bears the burden of innovating and moving forward. And nowadays, stagnation bears an even higher price than it has in the past.
The age in which we live is akin to a new industrial revolution, where exponential improvements in technology shape every industry and aspect of life. On one hand, this makes it easier to innovate, and gives companies the freedom to deliver better strategy, products, and services. On the other, it is now a lot easier to be left behind in a short period of time.
There is an old guard of CEOs that has been slow to adapt, and they have been punished for it. Once-iconic companies, such as Blockbuster, have become irrelevant for failing to see the writing on the wall. Others, such as Netflix, have evolved to embrace new technology and are now staples because of it.
The Netflix/Blockbuster dichotomy is perhaps one of the most jarring examples of companies reacting to new technological developments, but all industries are impacted in some way. This is why any CEO should be willing to learn about these changes and the ways that they may be affected. Not every CEO is going to be a tech expert, which is why it is important for them to stay humble and defer to the expertise of others in their companies who may have better insight into these issues. However, it is worth it for them to have enough of a working knowledge to be proactive instead of merely reactive to these changes. It's not enough for a CEO to copy a competitor's technological adoption because it's an "industry standard"; they should strive to improve upon these practices and identify new opportunities.
This is why some of the world's most prominent tech CEOs have stressed the importance of creating a "learning culture" in their companies. While opinions differed in some respects, the general sentiment was that challenging old opinions and surrounding oneself with talented individuals was a path to success. None attributed their success solely to their own prowess, which is perhaps an insight into the broader tenets of leadership.
The renewed focus on learning and growth has contributed to the importance of the Chief Technology Officer (CTO) and the Chief Innovation Officer (CINO). Both positions exist to identify areas of improvement and keep informed on the many opportunities available in any given industry. These roles started out as endemic to technology-focused companies, but since their inception, have spread to more and more businesses as it has become increasingly obvious that everyone is along for the ride when it comes to new developments.
Prioritization has become more important because of the pace of advancement. If a CEO is too broad in their examination of technology, they can easily lose focus and company tempo as a result. They need to be able to process information as fast as possible and determine whether a change should be passed over or is worth examining more closely. This also extends to internal company structure; organizations such as boards may be due for an audit if they have stagnated. For a CEO, it’s worth evaluating top performers in a company and making sure that they are in a position to facilitate innovation.
CEOs are being forced to reexamine themselves and the companies that they lead. The pace of new developments has created an environment in which no company is safe and its fortunes rest on the CEO's ability to adapt. However, with a mindset of continuous learning and a strong team, any leader can prepare their organization for the future.
About Damian Esteban
Damian was most recently the lead developer at Spare Change, Inc. where he focused on FinTech in mobile and web applications. He is a strong leader who believes that tight-knit teams can accomplish truly amazing things. Damian graduated from SUNY Geneseo in 2000 with a Bachelor of Arts in International Relations and McGill University in 2003 with a Master’s Degree in Islamic Studies. He attended the New York Code and Design Academy in 2013.
- Unix Server Administration
- RESTful API Design
- Reactive Programming