Written by Aaron “Hoss” Heise, Five-Star Software Consultant With J. Geiger Consulting

The Proliferation of Thinking Computers

Artificial intelligence is getting a lot of attention these days, particularly in computer vision and the adjacent field of self-driving cars. We already know that there’s some of that magic dust sprinkled on web search and especially the advertising that goes with it. There was the “big data” push a few years back, and in fact many, if not most, medium-sized or larger businesses already use a business intelligence tool of some sort for detecting and predicting various trends and patterns. And of course, the smartphone era has brought a personal touch to the machine with a digital assistant whose immense usefulness is hotly debated by rival manufacturers’ marketing teams—but really just makes us look like dweebs to our passengers when we say, “directions to the nearest gas station” for the third time, with dictation level set at “gentle mist”.

Supercomputer Not Required

Cloud architectures blur the lines of where the heavy lifting of algorithms happens, especially on smartphones, and by extension, how much power is needed to crunch the data. Apple has stated that personal information available to Siri is never sent up to the cloud, but at the same time, Siri will not work without a connection to the Internet. Business intelligence servers are usually some of the largest and most expensive machines in the enterprise (though a significant portion of that cost comes from the rather large price tag dangling from the software itself). The self-driving car prototypes we’ve seen have been controlled by computers ranging from what looks like a motherboard buried in a thicket of wires sitting in the front seat to a retro jet pack of sorts strapped to the roof of a commodity sedan. You probably know (or have been) someone who got a ticket from a camera on top of a stoplight for driving through a red light—presumably analyzed by rows of racks of computers in some government basement computer room with printers and envelope stuffers on the ends. (But the topic “options for municipal money printing” will have to wait for another article, I’m afraid.)

This all makes for some misleading information regarding the infrastructure needs of machine learning. As a consultant, I’m duty-bound to say, “it really depends on your application”—and of course to immediately follow with, “could I set up a quick call to discuss your needs in more depth?”—but there are many options, and it really does depend. What’s important to note, though, is that while the AI applications we notice have at best an unknown footprint, and in many cases a known large one, quite a bit of the AI we interact with every day uses far less power. The smartphone alone has probably a few dozen places where AI aids the user: face recognition, app suggestions, and location-based reminders, to name a few. These features are so lightweight they usually go unnoticed.

I’m Not a PhD

If the perceptions of infrastructure unknowns or magnitudes didn’t scare you away, the high viscosity of the pure science surely will. It takes only a couple of clicks from the “machine learning” Wikipedia page to reach formulas that anyone without a statistics degree would choke on. This stuff is driven by complex numerical relationships that take years of study to really understand. But over the last decade, multitudes of machine learning software libraries for a wide range of platforms have been created by people who do understand those relationships. Many of them are free, and most of the rest are not expensive. As a result, it is now possible to implement AI with just an understanding of how to feed the algorithms, without needing to know how to implement the algorithms themselves.
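To make that concrete, here’s a minimal sketch of what “feeding the algorithms” looks like in practice, using scikit-learn, one of the free Python libraries of the kind described above. The data and the scenario (guessing whether a user will open an app based on the time) are invented purely for illustration; the point is that the statistics stay hidden behind two method calls.

```python
# A hypothetical example: will a user open an app at a given time?
# The library owns all of the math; we only supply the data.
from sklearn.tree import DecisionTreeClassifier

# Each row is [hour_of_day, day_of_week]; each label is whether
# the user opened the app (1) or not (0). Made-up training data.
X = [[8, 1], [9, 1], [20, 5], [21, 6], [8, 2], [22, 6]]
y = [1, 1, 0, 0, 1, 0]

model = DecisionTreeClassifier()
model.fit(X, y)  # training: the algorithm finds the patterns itself

# Predict for 9 a.m. on a Tuesday — no formulas required of us.
print(model.predict([[9, 2]]))
```

Note that nothing above requires understanding how a decision tree is built internally; swapping in a different algorithm is typically a one-line change, which is exactly what makes these libraries approachable.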

This means that we can build AI into our web apps, mobile apps, fat clients, front ends, back ends, and a whole list of other buzzwords I deleted before this went out. It’s still not a drag-and-drop application of the technology (though there are business intelligence solutions out there that can provide some of this capability for $lots), and it requires some level of knowledge, but computer science and statistics-heavy business grads will likely already be equipped to make sense of it with only minor to moderate consternation.

Next Time

In part two, I’ll identify some real-world business use cases and describe practical implementations that work with the IT infrastructure most businesses running custom software already have, with minimal, if any, increase in system load. Where large-scale machine learning has helped us forecast optimal inventory and staffing levels by finding strong correlations in what sometimes seems like random data, small-scale machine learning can uncover new relationships in data that can change users’ interactions with applications in ways that would previously have been all but impossible.

Copyright © 2017 Aaron Heise, used with permission.