Your public school education exists, in large part, thanks to the Second Industrial Revolution. When that revolution took hold of America in the 1870s, some 30 years after the end of the first, half of the US population still spent their days toiling in fields. Education was typically voluntary, available only to families wealthy enough to afford tutors or school fees, and usually reserved for boys. With the development of commercial fertilizer and the internal combustion engine, farm productivity exploded while the share of farmers eventually dropped to less than two percent of the population. That lessened the demand for child labor, which in turn led to increased support for compulsory education for both sexes.
“The government at the time recognized [the need for public education],” Tim Weber, Global Head of 3D Printing and Advanced Applications at HP, told Engadget. “Basically to uplift the skills of people in the United States to adapt to an industrial revolution.”
The world is currently in the midst of its fourth Industrial Revolution, one driven by information and automation. As with previous revolutions, today’s technological advancements are threatening to upend established industry and labor practices through overwhelming productivity increases. Artificial intelligence and machine-learning systems are not just fundamentally shifting the way we interact with computers and data; they’re also changing how we’ll manufacture the modern world.
We’ve been using robots to augment (and to a degree, replace) human efforts on the assembly line since the days of Henry Ford. Automation and AI are simply the next logical step in that advancement. Robots can serve in a variety of roles, from the design and prototyping stages through production and shipping.
The days of “dumb” production line robots that repetitively weld or rivet in a preprogrammed sequence without fail are coming to an end. Tomorrow’s factories will run themselves and coordinate along the entire supply chain, with human oversight of course, but they won’t look — or operate — like any manufacturing facility you’ve seen before.
“I believe that we are going to see localized manufacturing,” Weber said. Rather than monolithic industrial centers, he figures that with automation and additive manufacturing, we’ll be able to tuck more, but smaller, production facilities closer to the populations that they serve.
“I think about it like the Amazon Marketplace,” Weber said. Companies from all over the world gather there to do business under the Amazon banner. Weber envisions a day when the designer of the toaster oven you’re about to buy might live in Lithuania, but when you hit the order button, the toaster simply prints out at a local production facility for you to pick up.
No fuss, no muss, no international tariffs or shipping fees. “Through 3D printing, fast automation, artificial intelligence, advanced IT systems,” Weber said. “All that’s going to eventually have manufacturing go local again.” But it’s really not as simple as installing a 3D printer in a storefront and calling it a day.
“You need to have smart machines before even thinking about having a smart manufacturing system. These machines collect and produce data that are needed for AI,” Vibhu Bhutani, Chief Strategy Officer at Softweb Solutions, said during a recent panel discussion hosted by the Industrial Design & Engineering Show. “The next building block is to have a platform that these smart machines can connect with and that enables data collection into cloud services.”
That data then needs to be ingested by a data analytics platform and worked into actionable instructions that the smart machines can understand, Bhutani continued. “These building blocks create a connected manufacturing floor and once that is in place you can use smart manufacturing,” he concluded.
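Stripped to its essentials, the pipeline Bhutani describes (machines emitting telemetry, a platform collecting it, analytics turning it into actionable instructions) can be sketched in a few lines of Python. Everything here, the machine names, metrics and threshold, is invented purely for illustration:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    """One telemetry sample from a machine on the floor."""
    machine_id: str
    metric: str
    value: float

def analyze(readings):
    """Toy analytics step: flag any machine whose (hypothetical)
    temperature reading exceeds a threshold and emit an instruction
    the machine could act on."""
    instructions = []
    for r in readings:
        if r.metric == "temp_c" and r.value > 80.0:
            instructions.append((r.machine_id, "throttle_spindle"))
    return instructions

# Simulated telemetry "collected into the cloud"
telemetry = [
    Reading("press_01", "temp_c", 72.4),
    Reading("mill_07", "temp_c", 91.2),
    Reading("mill_07", "vibration_mm_s", 3.1),
]

for machine, action in analyze(telemetry):
    print(f"{machine}: {action}")  # -> mill_07: throttle_spindle
```

A real connected floor would of course involve message brokers, cloud ingestion and trained models rather than a single threshold, but the shape of the loop is the same: collect, analyze, instruct.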
Those smart machines will likely be of the additive variety if Weber has any say. “HP has actually been working on 3D printing for years,” Weber said. “But we never took it to market for a couple of different reasons,” citing the lack of a truly disruptive product and the relative smallness of the market.
But that changed when the company developed its Jet Fusion printing system which, according to the company, is capable of making production-quality parts up to ten times faster than other 3D printing systems. Suddenly HP could shift from the $6 billion prototyping market, Weber explained, into the production market, which is worth an estimated trillion dollars annually.
But before HP — or any other company — can take that chance, Weber says six conditions must first be met. First, production capability: as Bhutani pointed out, you need physical machines that can produce at the scale you need. “You have to have machines that are manufacturing capable,” Weber said. “People are going to be running these machines, not as a small $500 consumer thing in your garage, but a machine that can basically run as part of a factory.”
Second, the industry needs to drastically expand the kinds of materials that it works with, Weber explained. Current injection molding technology is compatible with upwards of 50,000 different kinds of materials, many of which are specialized for very specific applications. 3D prototyping systems work with around a half dozen materials on average. Expanding that material palette will allow the industry to create items for increasingly diverse applications.
The third factor is price. While HP’s 3D printer is fast, traditional production methods remain far more cost effective, by around two orders of magnitude. Design quality is the fourth factor. As 3D printing technology becomes more prevalent, we’ll begin to see the way products are laid out change as well, as parts that used to require welding and riveting to fit together are instead printed fully formed.
The supply chain and regulatory reform are the fifth and sixth factors, respectively. HP is facing many of the same challenges as the United States Marine Corps in terms of additive manufacturing’s effects on its supply lines. No longer having to ship, inventory and store replacement parts upends the modern supply line and is a proportionally disruptive proposition. What’s more, HP has to worry about some issues that the USMC doesn’t, like taxes. Shipping, warehousing and inventory: “those economics are going to change as part of this industrial revolution,” Weber said.
“I think the mechanical engineers and the designers of the world — the people who figure out how to make things and how to make them look pretty and functional — you’re going to see more of that,” he said. “More opportunities to customize and figure out new ways of doing things so the demand for engineers and designers is only going to increase.”
Economics won’t be the only thing to change; the roles of human workers are sure to as well. No, we’re not all going to be put out of work by robots (at least not at first). A 2015 study from London’s Center for Economic Research found quite the opposite, in fact: that automation is actually helping boost productivity rather than cost jobs. “I think it will have a positive effect, which will increase the number of human innovations,” Ngai Zhang, Technology and Patent Law Attorney, told the IDE Show panel audience. “There is an opportunity for collaboration between AI systems and humans, not necessarily replacing humans.”
Sure, the dullest, most repetitive (and often lowest-paying) jobs will be outsourced to robots, but it’s not as if any of us really wanted to do them anyway. Automation is also opening up new roles for people. “You’re going to need many more trained technicians to basically operate the factory, to maintain it, and to kind of keep things running,” Weber said.
Thomas Howard, Director of the Robotics and Artificial Intelligence Lab at the University of Rochester, is inclined to agree. His lab recently trained a Baxter assembly robot to understand and respond to natural language commands. When given the instruction “pick up the middle gear in the row of five gears on the right,” the robot first converts the audio instructions into text, then uses that to determine where within its working environment it needs to move.
Cameras mounted in the Baxter’s arms help it quickly and accurately identify the correct gear before the grasper clamps down around it. Such a system could see factory workers of the future partnered with AI-driven assistants.
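To get a feel for the grounding problem Howard’s robot solves, here is a deliberately toy Python sketch. A real system uses perception and probabilistic language models rather than keyword matching; all function names, regions and coordinates below are hypothetical:

```python
def ground_command(command, workspace):
    """Toy 'grounding' step: map a spoken reference like
    'the middle gear in the row on the right' to a target position.
    `workspace` maps region names to ordered lists of (x, y) gear
    positions, as a stand-in for what the arm cameras would perceive."""
    tokens = command.lower().split()
    region = "right" if "right" in tokens else "left"
    row = workspace[region]
    if "middle" in tokens:
        return row[len(row) // 2]   # middle item of the row
    if "first" in tokens:
        return row[0]
    return row[-1]

# Five gears laid out in a row on the robot's right
workspace = {"right": [(0.4, y) for y in (0.1, 0.2, 0.3, 0.4, 0.5)]}

target = ground_command(
    "pick up the middle gear in the row of five gears on the right",
    workspace)
print(target)  # -> (0.4, 0.3), where the grasper should move
```

The hard part in practice is exactly what this sketch waves away: turning raw camera pixels into that `workspace` structure, and handling ambiguous or novel phrasings.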
“There’s a clear role to be played by a repeatable precise robotic system and a human,” Howard said. “That assembly process or logistics process is made more effective” by allowing a human to “more directly dialog with a robotic system.”
Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) recently revealed a similar effort, which they’ve dubbed ComText, as in “commands in context.” Rather than have the robot waste CPU cycles transcribing and interpreting commands itself, MIT’s ComText leverages an Alexa API to interpret the user’s commands. It’s basically a Skill like any other on the virtual assistant platform. You tell it, “Alexa, tell robot to pick up the box of crackers I just set down” and boom, the Baxter does so.
The ComText system is able to understand and react to new scenarios as they occur because its machine-learning mechanism doesn’t just study up on semantic data (i.e., “the sky is blue” or “cats are jerks”) but also episodic data: specific things it has learned from previous experience. So if you hold up a hammer and tell it “This hammer is my tool,” the system will add the connection between the hammer it sees and the idea that it is “my tool” to its knowledge database. But if you put that hammer in a toolbox and then tell the system to “pick up my tool that I just put down,” the system will understand both that the tool in question is your hammer (semantic) and know to look for it in the toolbox, since it just watched you put it there (episodic).
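That semantic/episodic split can be illustrated with a toy Python sketch. To be clear, this is an illustration of the idea, not CSAIL’s actual system, and every name in it is invented:

```python
class ToyMemory:
    """Minimal sketch of a two-part knowledge base: stable semantic
    facts plus a time-ordered episodic log of observations."""

    def __init__(self):
        self.semantic = {}   # label -> object ("my tool" -> "hammer")
        self.episodic = []   # ordered (object, location) observations

    def tell(self, label, obj):
        # "This hammer is my tool" -> a semantic fact
        self.semantic[label] = obj

    def observe(self, obj, location):
        # The robot watches an object move -> an episodic record
        self.episodic.append((obj, location))

    def locate(self, label):
        # "Pick up my tool that I just put down": resolve the label
        # semantically, then scan episodes newest-first for where
        # that object was last seen.
        obj = self.semantic[label]
        for seen, loc in reversed(self.episodic):
            if seen == obj:
                return obj, loc
        return obj, None

memory = ToyMemory()
memory.tell("my tool", "hammer")
memory.observe("hammer", "workbench")
memory.observe("hammer", "toolbox")
print(memory.locate("my tool"))  # -> ('hammer', 'toolbox')
```

The point of the split is that neither store alone answers the command: the semantic fact says what “my tool” is, and the episodic log says where it was most recently seen.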
Robots have made impressive gains in mobility and autonomy over the past few years, said Rohan Paul, CSAIL researcher and co-lead author of the ComText study. “What is lacking is robots’ ability to understand high-level concepts. And this is very important if robots have to operate in human-centric environments, where the human isn’t just supervising in-the-loop but actively working in-the-loop.”
The current problem is that robots generally see the world at a relatively low level — in pixels and sensor readings — but humans see it as related concepts, connected to form reasoning and higher order thinking, Paul explained. The goal of the CSAIL team’s experiment was to bridge that dissonance.
“We imagine untrained users would be able to talk to it and, with a human plus a robot, efficiency should increase,” he continued. “We’re entering an era where humans and robots will be working together.” The question is “How do we execute tasks efficiently with both a human and robot working together,” not just an autonomous machine.
“You want to leverage both the human capabilities and the robot’s capabilities together and communication is an important part,” Paul concluded. Taken together, these advancements in AI, autonomy, 3D printing and robo-communications are poised to fundamentally change the face of modern manufacturing. Rather than monolithic centers of industry, we may soon see a smaller, localized production base with specialized shops coordinating with each other along the supply chain to maximize efficiency while minimizing waste and cost. It’s an exciting vision of the future but one that remains tantalizingly beyond our grasp for now.
Read the full article on Engadget.