Recently at the World Economic Forum in Davos, a panel of industry leaders broached the topic of diversity, again. Of particular note was an example given by Joi Ito, Director of the Massachusetts Institute of Technology (MIT) Media Lab, demonstrating one of the many risks in the tech industry presented by a lack of diversity in thought and experience:
“One of our researchers, an African-American woman, discovered that in the core libraries for face recognition, dark faces don’t show up. And these libraries are used in many of the products you have.”
This oversight might be traced back to the fact that the lab where the AI was built and tested had few people of color with whom to test the system – or indeed, few for whom that issue would have come to mind in the first place. “One of the risks in the lack of diversity among the engineers is that it is non-intuitive about which questions you should be asking,” said Ito.
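The question Ito describes – does this system work equally well for everyone? – can be asked in code. The sketch below is a minimal, illustrative audit that breaks a face-detection model's results down by demographic group instead of reporting one aggregate number. The data here is synthetic and the group labels are assumptions for illustration; in practice the outcomes would come from running the actual library against a labeled, representative test set.

```python
# Illustrative sketch: disaggregated evaluation of a face-detection
# model. Aggregate accuracy can look fine while one group fails badly;
# computing a rate per group surfaces that gap.

from collections import defaultdict

def detection_rates(results):
    """Compute per-group detection rates from (group, detected) pairs."""
    counts = defaultdict(lambda: [0, 0])  # group -> [detections, total]
    for group, detected in results:
        counts[group][0] += int(detected)
        counts[group][1] += 1
    return {g: hits / total for g, (hits, total) in counts.items()}

# Synthetic test outcomes (True = the face was detected). In a real
# audit these would be model outputs on a labeled benchmark.
results = [
    ("lighter-skinned", True), ("lighter-skinned", True),
    ("lighter-skinned", True), ("lighter-skinned", False),
    ("darker-skinned", True), ("darker-skinned", False),
    ("darker-skinned", False), ("darker-skinned", False),
]

for group, rate in sorted(detection_rates(results).items()):
    print(f"{group}: {rate:.0%}")
```

A wide gap between groups is exactly the failure described above – and it is invisible if the test suite only reports a single overall detection rate.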
As a Black female engineer with over a decade of experience and an undying entrepreneurial spirit, my immediate concern here is a reduced profit margin. The word would spread that said product does not function as advertised, or at all, for approximately 13.2% of the potential U.S. customer base. The product may fail – simply due to an apparent disregard for the need for inclusion among its creators.
For me, diversity is quantifiable via distribution of equity, diverse representation in leadership, and decision-making power.
My hope is that we can all agree that authentic diversity and inclusion are lacking not just in the emerging field of AI, but in IT and tech… period, full stop. So, let’s take a moment to focus on engineering a solution to this problem.
Opportunities to test quantifiable solutions to our inclusion problems are both plentiful and ubiquitous. The AI skill gap is a great place to begin. Per the Vanson Bourne survey for Infosys, “Amplifying Human Potential,” more than four in ten employees do not feel they have the necessary development, security, or implementation skills required to use AI in a meaningful way. Only a minority (pun intended) of decision makers believe their organizations have the AI-related training or customer-facing skills needed to move AI forward in their businesses.
Let’s take advantage of the rapid changes happening across the tech spectrum – as IBM CEO Ginni Rometty has noted, we do indeed have an opportunity to create the ‘new collar’ jobs required by many in-demand industries (e.g. cybersecurity), yet these roles remain largely unfilled.
According to a recent U.S. Department of Labor report, there are over half a million technology-related positions currently open in the U.S. Those are significant numbers. Why are they unfilled? Success in global markets requires a diverse set of workers with a consistently evolving skillset. A paradigm shift is not optional if innovation is to continue. The competition for what most consider a small pool of qualified candidates is fierce. Per Rometty, “At IBM alone, we intend to hire about 25,000 (technology) professionals in the next four years in the United States, 6,000 of those in 2017. IBM will also invest $1 billion in training and development of our U.S. employees in the next four years.” Let’s use inclusion to increase the qualified candidate pool.
The idea that employment in the information sciences requires an adaptive skill set is not new, but we have yet to harness its full utility. Continuing education and training initiatives must be created, led, and implemented by the picture of diversity, from curriculum development through delivery.
So, instead of simply using women and people of color to demonstrate the utility of a product, let us also use women and people of color to model the product development lifecycle. This is where we can begin demystifying our research, stop just talking about the need for diversity, and start developing processes that will create a new and more accurate narrative. It’s about applying engineering and product development techniques to solve an abstract, but crucial, real-world problem.