Meet your future robotic colleagues

There’s been a lot of talk about robotics and automation systems recently. Many stories have emerged of companies developing such systems for business.

So EM360º thought it would be a good idea to round up the robots – physical and virtual – and see what they have to say for themselves.

Definitions

First, it may be worth making some distinctions between the types of technologies available, and whether they are currently useful, or may be useful in the future.

A lot of technologies promise to be useful in the future, so we’ll highlight systems that are closer to being commercialised.

Artificial intelligence

At the heart of all these robotics and automation systems is one or more forms of artificial intelligence.

Whether it’s machine learning, deep learning or some other form of AI, this is the key component that arguably distinguishes a robotic system from one that is simply automated.

While it’s probably beyond the scope of this article to go into the specifics of AI, it’s worth trying to describe it in simple terms.

AI is basically a computer program or system that can analyse data and make decisions.

Of course, the human programmer or systems engineer has to decide what data the AI system should collect or analyse, and also has to set the parameters of the system’s decision-making.
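
As a toy illustration only, the snippet below shows roughly what “analyse data and make a decision within parameters set by a human” can look like in code. The sensor readings, threshold and labels are all invented for this example, and real machine-learning systems learn such parameters from data rather than having them hard-coded.

```python
# A deliberately simple, hypothetical example: the "AI" here is just a
# decision rule whose parameter (the threshold) a human engineer has chosen.
readings = [2.1, 2.4, 8.7, 2.2, 9.1]   # invented sensor data to analyse
ALERT_THRESHOLD = 5.0                    # parameter set by the engineer

def decide(value: float) -> str:
    """Return a decision for a single data point."""
    return "raise alert" if value > ALERT_THRESHOLD else "carry on"

for value in readings:
    print(value, "->", decide(value))
```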

Robotics

Whatever its roots, the word “robot” has come to mean a lot of different things, including software-only systems.

But for this article, we will stick to the version which says a robot is a machine that moves and makes autonomous decisions.

Of course, such a machine needs AI and other software, but let’s keep them separate because even by our “machine-that-moves” definition, it can get complicated.

For example, cars are increasingly becoming robotic, with their autonomous parking, autonomous braking, and a growing list of other highly automated functions which are collectively called “advanced driver assistance systems” in the automotive industry.

The main types of robots that many people might think of are the industrial robotic arms used in manufacturing, and humanoids from any number of works of science fiction and, increasingly, real-world companies.

Clearly, cars and industrial robotic arms are useful – at least half the world would probably come to a halt if they stopped working.

But humanoids are yet to find a mass market, although some say that’s on the way, as they become more capable and the prices come down.

Automation

Automation is probably the most technically accurate term for this whole area.

Most people – in the industrial sector at least – probably think of robotics as a sub-branch of automation.

Also, it’s often said that what is described as AI today will be described as automation tomorrow.

For example, optical character recognition systems – where a scanner could scan a printed-text document and produce a typed, digital document from it – were initially described as artificial intelligence.
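
For readers curious what that looks like in practice today, here is a minimal sketch using the open-source Tesseract OCR engine through its pytesseract wrapper. It assumes Tesseract, pytesseract and Pillow are installed, and “scanned_page.png” is a hypothetical input file.

```python
# Minimal OCR sketch: read an image of a printed page and print the
# recognised text. Assumes the Tesseract engine plus the pytesseract and
# Pillow packages are installed; "scanned_page.png" is a placeholder name.
from PIL import Image
import pytesseract

text = pytesseract.image_to_string(Image.open("scanned_page.png"))
print(text)  # the printed page as plain, editable text
```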

Most of us were probably impressed by the technology at first, but we probably don’t think of it as AI now.

What is described as AI these days seems far more complex, sophisticated and capable. But someday in the future, these capabilities, too, may be described simply as automation.

Bots

This is a useful word because it has helped to draw a distinction between a physical robot and a virtual one.

“Bot”, as a lot of people will know by now, is generally used to mean a computer program or system that can behave as a human would, inside a single computer or on a computer network.

Essentially it’s a computer program, but it’s a humanoid program – if it could be called that – or a virtual humanoid.

Many types of bots are already in use and many more are in development, and, while not all of them are perfect, they are very useful to a wide variety of companies.

From chat bots that can autonomously chat with a visitor on a website to a bot which can visit thousands of websites in a matter of seconds to find information you may be searching for, bots are many and varied.

The most challenging type of bot to develop, perhaps, is an email bot. Most people may have used “auto-response” emails at some point, but what if it could keep on auto-responding, and responding in a way that is appropriate to the emails it is responding to?
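
To make the difficulty concrete, here is a minimal sketch of the idea behind such a bot, leaving out the mailbox plumbing entirely. It simply picks a canned reply based on keywords in the incoming message; the keywords, replies and message below are invented, and real products rely on far more sophisticated natural-language understanding than this.

```python
# Hypothetical keyword-based auto-responder: choose a canned reply by
# scanning the incoming message for known keywords. Purely illustrative.
CANNED_REPLIES = {
    "invoice": "Thanks, we've passed your invoice to our accounts team.",
    "meeting": "Thanks, we'll get back to you shortly with some times.",
    "unsubscribe": "You've been removed from our mailing list.",
}
DEFAULT_REPLY = "Thanks for your email. A colleague will reply soon."

def auto_reply(subject: str, body: str) -> str:
    """Choose a reply by looking for known keywords in the message."""
    message = (subject + " " + body).lower()
    for keyword, reply in CANNED_REPLIES.items():
        if keyword in message:
            return reply
    return DEFAULT_REPLY

print(auto_reply("Invoice attached", "Please find our invoice for May."))
```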

The word “bot” is probably not favoured by the companies developing “virtual robots” to deal with emails on your behalf, manage your diary and perform other basic tasks of that kind, since the word probably wouldn’t do such sophisticated and useful systems justice.

A company called x.ai is probably one of the better-known companies developing this type of bot, but the company refers to its system as a “personal assistant”, an obvious reference to the administrative tasks that it can perform.

There are numerous other companies developing such software, and a list can be found at G2 Crowd.

Some users of such “personal assistant” bots say they are very useful and save a huge amount of time, but we haven’t evaluated any of them, so we can’t say.

One bot – if it can be called that – we have tested is Google’s speech-to-text conversion bot, through Google Docs; we’ve also tested speech-to-text bots which live inside our computers.

Such bots can actually be very useful, if you’re willing to compromise. In our experience, they are much more than 50 per cent accurate – they type more than half of what you say correctly.

You may end up spending longer fixing their errors than it would’ve taken you to type everything yourself anyway, but this, in our opinion, is a very important technology.
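
For those who want to try this kind of bot themselves, a sketch along the following lines is one way in. It uses the open-source SpeechRecognition package for Python, which can send audio to Google’s free web recogniser, similar in spirit to the Google Docs dictation we tried. The package must be installed, and “dictation.wav” is a hypothetical recording.

```python
# Speech-to-text sketch using the SpeechRecognition package and Google's
# free web recogniser. "dictation.wav" is a placeholder file name.
import speech_recognition as sr

recognizer = sr.Recognizer()
with sr.AudioFile("dictation.wav") as source:
    audio = recognizer.record(source)   # read the whole recording into memory

try:
    print(recognizer.recognize_google(audio))   # requires an internet connection
except sr.UnknownValueError:
    print("Could not make out the speech.")     # the "compromise" in practice
```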

Natural language processing

The speech-to-text applications depend on what’s known as natural language processing.

It’s a very challenging area for programmers because every person has his or her own way of speaking.

If everyone spoke the same way, perhaps mostly in monosyllabic words forming extremely short sentences, it might make it easier for the computers.

But until that day, computers which attempt to understand humans will continue to struggle.

This area of programming is also thought of as AI. But once all human speech is successfully reduced to an algorithm, perhaps it would be thought of simply as automation, similar to optical character recognition.

A robotic example to all

Speech-to-text is apparently one stage of the natural-language processing system used by Hitachi’s humanoid robot, Emiew.

The software engineer behind Hitachi’s robot, Dong Li, says: “These robots have several stages to transfer a voice into an understandable command.

“The first step is speech-to-text. The second step is to change the text into a meaningful thing that the robot can understand.”

Presumably, the transcribed text is then mapped onto an internal representation the robot’s software can act on, with the robot’s reply converted back into spoken language.
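
Hitachi hasn’t published Emiew’s internals, so the snippet below is only a guess at what “changing the text into a meaningful thing the robot can understand” might involve: matching the transcribed words against a small set of known commands. The command names and keywords are invented purely to illustrate the idea.

```python
# Hypothetical second stage of the pipeline Li describes: map a speech-to-text
# transcript onto one of the robot's known commands. All names are invented.
COMMANDS = {
    ("where", "platform"): "GIVE_DIRECTIONS_TO_PLATFORM",
    ("when", "plane"): "LOOK_UP_FLIGHT_ARRIVAL",
    ("where", "product"): "LOOK_UP_PRODUCT_LOCATION",
}

def text_to_command(transcript: str) -> str:
    """Return the command matching the transcript, or ask the user to repeat."""
    words = set(transcript.lower().replace("?", "").split())
    for keywords, command in COMMANDS.items():
        if all(keyword in words for keyword in keywords):
            return command
    return "ASK_USER_TO_REPEAT"

print(text_to_command("Where is platform three?"))  # GIVE_DIRECTIONS_TO_PLATFORM
```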

Emiew, like similar robots such as Pepper, Nao and others, tends to be connected to vast cloud computing systems which can help with the natural-language processing.

Google was reported to have developed a system which eliminates the need for one of the stages – the speech-to-text transcription stage.

But even Google’s natural language processing systems are not flawless, which underlines the challenges programmers face in this area, and it clearly marks out the area of competition for tech companies.

“I think that [natural language processing] is the difficult part, so yes, it is the area that companies [are] competing with each other to make better, and it’s something we’re doing as well,” says Li.

Some say the best way for these robots – virtual or otherwise – to learn to understand the human speaking to them is through trial and error: the more time a robot spends listening to one particular human, the more likely it is to learn that person’s particular speaking style.

But relatively few people have the patience to train their robot at the moment, although they might reconsider if the end result was shown to be worthwhile. There’s money in that gold mine.

As if it needs to be said, whoever can develop the most accurate natural language processing system is probably in line for history’s largest fortune.

If a computer could flawlessly convert a human’s speech to text, then it could create entirely new markets and do away with many current technologies, such as keyboards – no more typing.

Natural-language processing also applies to written text, but whether such a system can understand and respond to emails appropriately is a separate matter.

A bot which Microsoft released last year, called Tay, was quickly taught by Twitter users to swear, which led the company to withdraw it.

People tend to write differently from the way they speak. Spelling mistakes, diverse grammatical styles, swearing and so on, not to mention all the other thinking a human does when dealing with emails, are currently beyond computer systems.

Even a straightforward appointment-scheduling bot is not necessarily a straightforward development task, which is probably why companies like x.ai were in stealth mode for so long; x.ai raised around $34 million to launch its “intelligent assistants”.

But, as Dennis Mortensen, founder and CEO of x.ai, suggests, there is a lot of demand for such bots – or whatever they’re called.

“We have a very healthy waitlist for the product,” says Mortensen. “It’s clear to us that having an AI assistant take over the job of scheduling meetings removes a lot of very real pain.”

Not just a pretty humanoid 

And while some people might think of humanoids like Emiew as little more than toys, it may be worth mentioning some facts about its longer-established competitor, Nao.

Nao is similar in dimensions and colour scheme to Emiew, and like Emiew, it’s primarily designed for customer service: talking to people visiting a hotel, shop, transport hub or any other business where someone may need to ask a question, perhaps about directions to a platform, when their plane is scheduled to arrive, or where a particular product is in a shop.

Nao is one of the humanoids owned by telecommunications giant SoftBank, which also owns Pepper.

SoftBank says it has so far sold 10,000 units of the Nao robot, which costs almost $10,000 each. That’s roughly $100,000,000 in gross revenue from sales of that one robotic system. Even if bought in bulk at half price, it’s still a colossal amount of dosh.

And Pepper apparently sells even faster – within minutes online every time the company releases a new batch, which tends to be limited in number to about a thousand each time.

From the point of view of a buyer, even if Nao, Emiew and Pepper have difficulties understanding every human that mumbles at them, purchasing and displaying such technology still sends positive signals to a company’s customers. It suggests the company embraces new technology and wants customers to enjoy the experience of visiting its store or station, even if that’s just through having fun talking to a robot.

Because, after all, who doesn’t want to speak to a robot?