Friday 4 January 2013

Manufacturing automation shows us the future of big data in most companies

Here I wrote about big data software techniques by analogy with manufacturing automation, and then how the analogy holds in practice:
http://onelesscut.blogspot.co.uk/2013/01/big-data-software-techniques-in.html

The analogy is perhaps more interesting than the practice. What do robots bring to manufacturing, and how does the analogy with big data software techniques play out in the future?

However repetitive the job, robots bring:

  • Accuracy and quality - they execute repetitive jobs to a high standard with repeatable results
  • Speed and efficiency - they're great at crunching through repetitive tasks
  • Reliability - they're 'always on'
  • Low cost - see Reliability; also they can replace people (hey, it's true) 

They also allow integration further up the design and manufacturing process, encoding a physical form into a digital representation via CAD/CAM.

Robots are getting more complex. They're getting smaller, cheaper and more autonomous. They can handle jobs that required human intervention only a few years ago. Most of the advances in robotics appear to come down to advances in software: signal processing, control logic and the like.

Big data software techniques are like our robots. They bring:

  • Accuracy and quality - algorithms manage distribution and execution of repetitive jobs to a high standard with repeatable results
  • Speed and efficiency - they're great at crunching through repetitive tasks
  • Reliability - distributed processing gives greater resilience, and once you get the algorithm right it can be applied to truly massive datasets (see the sketch after this list)
  • Low cost - you can do an awful lot with less (smaller, cheaper) computing power, and for some tasks they can replace people manually sifting unstructured data.
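
To make that concrete, here's a minimal sketch of the 'write the algorithm once, run it at any scale' idea, using Python's standard multiprocessing pool as a stand-in for a real distributed framework like Hadoop. The log format and function names are mine, purely for illustration:

```python
from multiprocessing import Pool
from collections import Counter

def count_events(chunk):
    """Map step: tally event types in one chunk of log lines."""
    counts = Counter()
    for line in chunk:
        event = line.split(',')[0]  # assumed CSV layout: event,timestamp
        counts[event] += 1
    return counts

def merge(counters):
    """Reduce step: combine the per-chunk tallies into one result."""
    total = Counter()
    for c in counters:
        total.update(c)
    return total

if __name__ == '__main__':
    # count_events never changes - whether 'chunks' holds two small
    # lists or billions of lines spread across a cluster, only the
    # infrastructure feeding it differs.
    chunks = [["pump_start,09:00", "alarm,09:05"], ["alarm,10:10"]]
    with Pool() as pool:
        print(merge(pool.map(count_events, chunks)))
```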

Where now?

Well, first off I think we may see Data Scientists being moved off the big data frontline, away from the data itself and back towards widely applicable algorithms. Scientists are usually first into new areas of learning but they're quickly supplanted by engineers. As it was with robotics.

Once the (software) engineers have got these techniques working for industry, their role will become supportive, with end-users taking the lead. Just as CAD/CAM lets a subject matter expert apply their knowledge of the physical domain to optimise manufacturing capability, so big data software techniques will let subject matter experts apply their algorithms to improve a process - sales, production or whatever.

That sounds a bit like marketing flannel, so here are some examples:

1. "Hey computer, I'm worried about benzene contamination in my product. Should I be?"

[Computer starts complex, distributed log analysis of a few million lines of real-time data.]
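
Behind the curtain, the first pass might look something like this filter-and-flag sweep. The log format, sensor tag and 5 ppb limit are all invented for the sake of the example:

```python
# Illustrative only: log format, field names and the contamination
# limit are assumptions, not taken from any real system.
BENZENE_LIMIT_PPB = 5.0  # hypothetical threshold

def benzene_alerts(log_lines):
    """Yield (timestamp, reading) for any benzene reading over the limit."""
    for line in log_lines:
        timestamp, sensor, value = line.strip().split(',')
        if sensor == 'benzene_ppb' and float(value) > BENZENE_LIMIT_PPB:
            yield timestamp, float(value)

sample = [
    "2013-01-04T09:00,benzene_ppb,1.2",
    "2013-01-04T09:05,benzene_ppb,7.8",   # over the limit
    "2013-01-04T09:05,temp_c,80.1",
]
for ts, ppb in benzene_alerts(sample):
    print(f"ALERT {ts}: benzene at {ppb} ppb")
```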

2. "Hey computer, find out how much product we've had to flare off and how much it cost."

[In the future, everyone says 'Hey, computer'. Computer finds all the flaring incidents, exactly how much product was sent to flare from where, and the value of each product.]
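
Again, a toy version of what the computer would be doing: grouping flare events by source and product and pricing them up. The event records and prices are made up:

```python
# Sketch of the flare-accounting query; records and prices are invented.
flare_events = [
    {"source": "unit_A", "product": "propane", "tonnes": 2.0},
    {"source": "unit_B", "product": "butane",  "tonnes": 0.5},
    {"source": "unit_A", "product": "propane", "tonnes": 1.5},
]
price_per_tonne = {"propane": 600.0, "butane": 550.0}  # hypothetical prices

totals = {}
for e in flare_events:
    cost = e["tonnes"] * price_per_tonne[e["product"]]
    key = (e["source"], e["product"])
    tonnes, total_cost = totals.get(key, (0.0, 0.0))
    totals[key] = (tonnes + e["tonnes"], total_cost + cost)

for (source, product), (tonnes, cost) in sorted(totals.items()):
    print(f"{source}: {tonnes} t of {product} flared, costing {cost:.2f}")
```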

3. "Hey computer, can you reduce my electricity bill?"

[Computer looks at the efficiency of every component in a process and tries to optimise usage, taking into account the effect on other parts of the process.]
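
As a toy version of that optimisation, here's a brute-force search over pump speeds that minimises power while preserving throughput. The affinity-law cost model and every number in it are assumptions:

```python
# Toy optimisation sketch standing in for a real process-wide optimiser.
import itertools

SPEEDS = [0.6, 0.8, 1.0]          # candidate fractions of max speed per pump
REQUIRED_FLOW = 1.4               # arbitrary units

def power(speeds):
    # Pump power rises roughly with the cube of speed (affinity laws).
    return sum(10.0 * s ** 3 for s in speeds)

def flow(speeds):
    # Flow is roughly linear in speed; the pumps feed a shared header.
    return sum(1.0 * s for s in speeds)

best = min(
    (combo for combo in itertools.product(SPEEDS, repeat=2)
     if flow(combo) >= REQUIRED_FLOW),
    key=power,
)
print(f"Run pumps at {best}: {power(best):.1f} kW for flow {flow(best):.1f}")
```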

The elegance here is that these techniques don't need a multi-petabyte dataset to show value; it's about taking the existing data and reforming and translating it into new forms on the fly through algorithms.
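
That 'on the fly' reshaping is what lazy pipelines are for. Here's a sketch using plain Python generators, with an invented tag scheme, turning raw power readings into energy figures without ever materialising an intermediate dataset:

```python
# Lazy generator stages that reshape existing records on the fly.
def parse(lines):
    for line in lines:
        ts, tag, value = line.split(',')
        yield ts, tag, float(value)

def to_kwh(readings):
    # Translate raw watt readings into a new form (kWh per 1-min sample).
    for ts, tag, watts in readings:
        if tag == 'power_w':
            yield ts, 'energy_kwh', watts / 1000.0 / 60.0

raw = ["09:00,power_w,1500", "09:01,temp_c,21.0", "09:02,power_w,900"]
for row in to_kwh(parse(raw)):
    print(row)
```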

Ultimately this feeds right back up to the design process for new facilities, processes and even businesses. As Google extends each of our knowledge and even our memories, we'll rely on algorithms chewing through lots of data to be our enterprise memory.
