Tech time: Robots, AI and drones shake up fashion

Louis Vuitton SS16

Experts predict a revolution in the next 10 years with robot sales assistants, robot manufacturing, chatbots and AI working their way into every area of the fashion sector. Are you ready?

Not so many years ago, the height of retail tech was a sales assistant with an iPad. But technology moves fast and yesterday’s cutting edge innovation has become today’s same old, same old. So what tech is shaking up the business world today and how might it affect the fashion sector?

Think robots, drones, machine learning/artificial intelligence, virtual reality and chatbots. All of these make iPads in-store feel almost Stone Age.

Some of the latest developments are ultra-high-profile. Take Nicolas Ghesquière’s definitive statement about what the future might hold: for Louis Vuitton’s spring/summer 2016 ad campaign, he picked a video game “virtual” face, Final Fantasy XIII’s Lightning, as his star. The kick-ass heroine of the game series looked as good as any supermodel in the shots, and guaranteed no hissy fits too.

But virtual reality models aside, some of the most interesting developments today are happening around robots, even though it’s undeniable that the fashion sector is a bit late to the party.

Louis Vuitton SS16: Final Fantasy XIII’s virtual star

I, Robot

The car industry has been using robots for decades. But then welding or sticking large, hard pieces of metal and plastic together is a lot easier than manipulating soft, often tiny, stretchy pieces of fabric, and it has taken until now to overcome the challenges.

Last December, though, Adidas unveiled a prototype robot factory to make sports shoes, and just a few weeks ago it announced that production will start next year. The plant is near its HQ in Germany and marks a step change in a trend that has seen manufacturing moving eastwards for the past two decades. Apparently, Nike is working on its own robot factory too.

Adidas
The Adidas Speedfactory

Now, the Adidas robots won’t look like extras from the cast of I, Robot sitting at machines. They’re much less humanoid than that. In fact, early imagery shows what appear to be giant robot arms.

Adidas says the aim is to get production of certain items closer to the places where its customers are and to offer those consumers greater personalisation. In a way it’s a mechanised version of Inditex’s strategy where some items are still made in Europe in order to respond more quickly to trends.

It means trend-driven products that can be designed, made and shipped ultra-fast. No surprise then that Adidas has named the new manufacturing plant Speedfactory. It plans to open more of them in Germany and probably in the UK, France and US too.

Smaller production runs are also feasible. In fact, we’ve already seen how Nike’s Flyknit tech can reduce the need for shoes to be made in multiple sections, which could even mean small-scale robotic production units in stores.

Now, while such tech is obviously loaded with potential, it also generates concerns, and one of the biggest is that it could decimate jobs. However, robot factories are unlikely (for now) to put large numbers of low-paid workers in Asian factories out of work. Adidas has already said that rising demand means it needs to increase its factory space every year anyway, and its robot factories will fill that gap with no need to cut back elsewhere.

US company Softwear Automation echoes this. One of the biggest names in robot apparel manufacturing, it has cracked the problem of robots handling soft, small and sometimes stretchy pieces of fabric without distorting them, and has been surprised by just who has ordered its machinery.

It expected orders from US cut-and-sew firms that wanted to bring manufacturing closer to home. What it also got were orders from Asian factories that were struggling to find skilled workers and the company is predicting fully automated factories being widespread within a decade.

Pleased to meet you…

So if manufacturing will be revolutionised within 10 years, how about other parts of the fashion sector? Where robots are likely to make a much earlier and more visible (to consumers) impact is in areas where they don’t have to try to stitch multiple pieces of fabric together without making a mess of it.

That means in warehouses, in delivery, in call centres and yes, even in stores. Here the simple function of saying hello to customers and offering limited information is the start of a robot journey that will likely dive deeper into a variety of in-store functions in the next few years.

Early last year, Tokyo’s Mitsukoshi Nihombashi department store had a guest receptionist, a lifelike android by Toshiba called Aiko Chihira, who was there to give directions to customers. Later in the year Toshiba unveiled a more advanced version, called Junko Chihira, who was multilingual and acted as a guide in a tourist centre. By 2017, Toshiba says, its bots will be able to answer customer questions as well as offering up pre-set information, and by the 2020 Tokyo Olympics it expects to have a full cohort of robot tour guides.

Such humanoid-but-not-human-enough bots are fascinating and would certainly be a draw in-store. Yet they also have a slightly creepy edge. Scientists say that’s because our brains recognise them as human but also know that there’s something not right about them, the so-called “uncanny valley”. Which is perhaps why cute-but-obviously-robotic bots might be more appealing to the average shopper.

Just look at how appealing confused.com’s robots have been. They’re very cute and are giving comparethemarket.com’s meerkats a run for their money.

SoftBank’s Pepper, the super-cute, obviously-not-human robot, is already being touted as a future in-store sales assistant. We’ve seen one example of Pepper running a mobile phone store single-handedly (one customer even walked out having bought her own Pepper), and another where SoftBank teamed up with MasterCard to allow Pepper to process payments at Pizza Hut. OK, using robots to sell tech and process payments is a logical step. And while Pepper can’t yet make the leap to fashion and offer up opinions like “that dress looks great but perhaps a size 12 would fit better,” that will come.

For now, fetching, carrying, payments and offering information are the obvious functions that robots can deal with very efficiently, and tech firms are putting plenty of investment into this area. Hitachi, for instance, is developing a humanoid robot to work in-store that can actively approach customers, although again its main function is to offer pre-set information.

Hitachi’s EMIEW3 is the latest iteration in the EMIEW series that launched as far back as 2005. Today, it has a remote brain connected to cloud-based intelligent processing systems, can communicate with other robots and can even stand up again on its own if it’s knocked over.

Hitachi’s EMIEW3 robots

Taking shopper-robot interaction further, we’ve already seen the ultimate in robot bling with one wealthy shopper in Guangzhou, China being accompanied in and out of luxury stores by a coterie of robot “maids” who waited patiently while he shopped and dutifully carried all his purchases for him. But while that rather tasteless display elicited a storm of disapproval on Chinese social media (both for the excessive display of wealth and the sexism as the robots were obviously female), it did demonstrate yet another way in which robots will interact with people around retail.

Drones and deliveries

And then there are drones. Early forms of this particular tech (outside of military use) did little more than fly around. But only this month Wal-Mart said it is testing drones and AI in its warehouses, using the drones to identify stock issues in a single day, a process that takes around a month when done by people.
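However the scanning hardware works, the stock-audit idea boils down to comparing what the drones see on the shelves against what the inventory system thinks is there. The sketch below illustrates that comparison; the SKUs, counts and function name are invented for illustration and are not Wal-Mart’s actual system.

```python
# Hypothetical data: what the inventory system expects vs what drones scanned.
expected = {"SKU-001": 40, "SKU-002": 25, "SKU-003": 10}
scanned = {"SKU-001": 40, "SKU-002": 22, "SKU-003": 0}


def stock_issues(expected, scanned):
    """Return the SKUs where the scanned shelf count disagrees with the system."""
    return {
        sku: {"expected": exp, "scanned": scanned.get(sku, 0)}
        for sku, exp in expected.items()
        if scanned.get(sku, 0) != exp
    }
```

Run over a whole warehouse, a report like this flags misplaced, missing or miscounted stock in hours rather than the weeks a manual count takes.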

Amazon is also testing drones as part of a major investment in robotic technology and the company already has around 30,000 non-humanoid functional robots performing warehouse tasks. They operate alongside human employees with current thinking being that people and robots are complementary. Real humans are still needed to do the more sophisticated thinking and perform the more detailed tasks that robots can’t.

https://www.youtube.com/watch?v=quWFjS3Ci7A

But as each year goes by there are fewer tasks that robots can’t do. Both Wal-Mart and Amazon are looking at drones and robots for delivering e-commerce orders, for instance. And while drones could be slow to take off (excuse the pun) here, as the authorities in many countries simply don’t want too many robotically-controlled devices flying around, ground-based robots are showing real potential for last-mile/kerbside delivery.

Starship Technologies, a company owned by the co-founders of Skype, is testing self-driving delivery robots in the UK and US. They make local deliveries and can complete them within 5-30 minutes from a local hub or store, with the cost being less than 10% of current last-mile delivery alternatives, the firm claims. Delivery slots come in tighter windows, so customers don’t have to wait in as long, and shoppers can track a robot’s location via an app.

For security, only the customer can unlock the robot once it arrives at its destination, while as a deterrent to would-be thieves, the robot has multiple on-board cameras and internet-connected GPS, so an operator can speak to people nearby.
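That customer-only unlock can be illustrated with a simple one-time-code check: the service derives a short code from the order, sends it only to the customer’s app, and the robot opens only when the same code is presented. Everything here (the HMAC scheme, the function names, the 6-character code) is an assumed sketch, not Starship’s actual protocol.

```python
import hashlib
import hmac


def unlock_code(order_id: str, secret: bytes) -> str:
    """Derive a short one-time unlock code for a delivery (illustrative only)."""
    digest = hmac.new(secret, order_id.encode(), hashlib.sha256).hexdigest()
    return digest[:6]


def can_unlock(order_id: str, secret: bytes, presented_code: str) -> bool:
    """The robot opens only if the presented code matches the one
    sent to the customer, compared in constant time."""
    return hmac.compare_digest(unlock_code(order_id, secret), presented_code)
```

The point of the design is that the robot never needs to know who the customer is, only whether the code presented at the kerb matches the one derived for that order.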

Starship Technologies’ self-driving delivery robot

AI, data and chatbots

Perhaps the most practical revolution that could affect fashion though is actually more “invisible” robot functionality and artificial intelligence. That means robots/AI inside tills and stockroom computers, analysing sales, gathering data and sharing insights with real live humans to guide decision-making around things like buying and marketing.

Guess, for instance, is the official launch partner this summer for a start-up’s new service that aims to track what web browsers are looking for and offer it up in their local store. Radius8 is a cloud-based platform that can collect local data on what web browsers are searching for. So if lots of people near a particular Guess store are searching for distressed skinny jeans or camo bomber jackets, the retailer can make sure it not only has those items in-store but is promoting them heavily in that local area. In another area, it might be more about Bardot tops and ankle strap sandals. It allows giant multinationals to localise shopping.

Radius8 CEO and co-founder Sandeep Bhanote says fewer than 5% of online searches end up as sales at the moment so converting web browsers to physical shoppers might be a very effective way of maximising online activity and “will help make the retail store relevant again.” He also says the tech could “more effectively monetise local store inventory, and minimise the margin-eroding discounting and markdown tactics widely employed today.”

It’s a great example not of big data but small, local data that could have a big effect on the bottom line.
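The core of that small-data idea is easy to sketch: tally the search terms seen near each store and surface the most popular ones for local stocking and promotion. The store IDs, search events and function name below are invented for illustration; Radius8’s real platform is of course far more sophisticated.

```python
from collections import Counter

# Hypothetical sample of local search events as (store_id, search_term) pairs.
# In a real system these would stream in from web analytics near each store.
searches = [
    ("store_nyc", "distressed skinny jeans"),
    ("store_nyc", "camo bomber jacket"),
    ("store_nyc", "distressed skinny jeans"),
    ("store_miami", "bardot top"),
    ("store_miami", "ankle strap sandals"),
    ("store_miami", "bardot top"),
]


def top_local_terms(events, store_id, n=2):
    """Return the n most-searched terms near a given store."""
    counts = Counter(term for store, term in events if store == store_id)
    return [term for term, _ in counts.most_common(n)]
```

Fed live data, the same tally tells the New York store to push skinny jeans and bombers while the Miami store pushes Bardot tops and sandals.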

Companies can also use AI systems to advise them on where to place their media spend most effectively, or to use “visual listening” to identify images shared on social media (even without accompanying text) that might be relevant to them, so they can advertise where those images are being seen or gain new insights into potential customers.

Another example of “invisible” robotics is the virtual “call centre”. Chatbots have been very much in the news this year after Microsoft’s Tay chatbot used machine learning to pick up on the subjects people were discussing on social media and then join in their conversations. The only trouble was that these social media chats were often racist or sexist, and Tay innocently joined the party!

Tay may have been quickly withdrawn but Microsoft, Facebook, Instagram and more are all heavily committed to chatbots and so are many more retail businesses. At the moment, their scope may be limited and the conversations they have may be frustrating (my personal experience is that every question I’ve ever asked has come back with a pretty irrelevant answer).

But the tech is becoming more nuanced daily and a time will come when it’s clever enough to be indistinguishable from a conversation with a person.
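At its simplest, a retail chatbot is a matching-plus-fallback loop: match the customer’s message to a known intent, or admit defeat. The sketch below uses naive keyword matching with invented intents and answers; real systems use trained intent classifiers rather than substring checks (which would happily match “fit” inside “outfit”), but the overall structure is the same.

```python
def reply(message: str) -> str:
    """A minimal keyword-matching chatbot with a fallback answer."""
    # Hypothetical intents: keywords mapped to canned retail answers.
    intents = {
        ("size", "fit"): "Sizes run slightly small; many customers size up.",
        ("delivery", "shipping"): "Standard delivery takes 3-5 working days.",
        ("return", "refund"): "Returns are free within 28 days.",
    }
    text = message.lower()
    for keywords, answer in intents.items():
        if any(word in text for word in keywords):
            return answer
    # The fallback is where today's bots produce those irrelevant answers.
    return "Sorry, I didn't catch that. Could you rephrase?"
```

The difference between today’s frustrating bots and tomorrow’s convincing ones lies almost entirely in how well that matching step understands the question.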

Does that prospect, and all the other innovations detailed here, excite you or make you feel uneasy? Chances are it’s a bit of both. But while many ethical and logistical issues remain, there’s no stopping the tech. As they used to say back in the 50s and 60s, “that’s progress”. We all need to make sure we’re prepared for it.