Machine vision vs computer vision vs image processing

We’re often asked: what’s the difference between machine vision and computer vision, and how do they both compare with image processing? It’s a valid question, as the seemingly similar fields of machine vision and computer vision both relate to interpreting digital images to produce a result.

What is machine vision?
Machine vision has become a key technology for quality control and inspection in manufacturing, driven by increasing quality demands and the improvements it offers over human vision and manual operation.

It is the ability of a machine to sequentially sense an object, capture an image of it, and then process and analyse that information for decision-making.

In essence, machine vision enables manufacturing equipment to ‘see’ and ‘think’. It is an exciting field that combines technologies and processes to automate complex or mundane visual inspection tasks and precisely guide manufacturing activity. In an industrial context, it is often referred to as ‘Industrial Vision’ or ‘Vision Systems’. Because raw images in machine vision are generally captured in real time by a camera connected to the system (unlike much of computer vision, which often works on pre-existing images), the disciplines of physics relating to optics, lighting and filtering are also part of the machine vision world.
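The sense–capture–process–decide sequence described above can be sketched in a few lines. This is a hypothetical illustration, not a real system: the image, defect threshold and tolerance are invented stand-ins, and a real camera capture is simulated by a hard-coded grayscale array.

```python
# A minimal sketch of the machine-vision pipeline: capture an image,
# process/analyse it, then make a decision. All values are hypothetical.

def capture_image():
    # Stand-in for a real camera capture: a tiny 4x4 grayscale image
    # (0 = dark defect, 255 = bright background).
    return [
        [255, 255, 255, 255],
        [255,   0, 255, 255],
        [255, 255, 255, 255],
        [255, 255,   0, 255],
    ]

def process(image, dark_threshold=50):
    # Analysis step: count pixels dark enough to look like defects.
    return sum(1 for row in image for px in row if px < dark_threshold)

def decide(defect_count, max_defects=1):
    # Decision step: pass the part only if defects stay within tolerance.
    return "PASS" if defect_count <= max_defects else "FAIL"

defects = process(capture_image())
print(decide(defects))  # 2 dark pixels exceed the tolerance of 1 -> "FAIL"
```

In a real deployment the capture step would talk to camera hardware, and the analysis would involve far more than a pixel count, but the shape of the pipeline is the same.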

So, how does this compare to computer vision and image processing?
There is a great deal of overlap between the various disciplines, but there are clear distinctions between the groups.

| Discipline | Input | Output |
| --- | --- | --- |
| Image processing | An image is processed using algorithms to correct, edit or enhance it, creating a new, better image. | An enhanced image is returned. |
| Computer vision | Images/video are analysed using algorithms, often in uncontrolled or unpredictable circumstances. | Image understanding, prediction and learning to inform actions such as segmentation, recognition and reconstruction. |
| Machine vision | Camera/video images are analysed in industrial settings under more predictable circumstances. | Image understanding and learning to inform manufacturing processes. |
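The image-processing row — an image goes in, a new, enhanced image comes out — can be illustrated with a toy example. This is a hypothetical sketch: the "enhancement" is a simple linear contrast stretch on an 8-bit grayscale image represented as nested lists, not a production algorithm.

```python
# Toy image-processing step: take an image in, return an enhanced image out.
# Here the enhancement is a linear contrast stretch of an 8-bit image.

def contrast_stretch(image):
    lo = min(px for row in image for px in row)
    hi = max(px for row in image for px in row)
    if hi == lo:  # flat image: nothing to stretch, return a copy
        return [row[:] for row in image]
    # Map the observed range [lo, hi] onto the full [0, 255] range.
    return [[round((px - lo) * 255 / (hi - lo)) for px in row] for row in image]

dull = [[100, 110], [120, 130]]   # low-contrast input
print(contrast_stretch(dull))     # [[0, 85], [170, 255]]
```

The defining property, as in the table, is that the output is simply another image; no interpretation or decision is made.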

Image processing has its roots in neurobiology and scientific analysis. This was largely down to the limitations of processor speeds when image processing came to the fore in the 1970s and 1980s: for example, processing single images captured from a microscope-mounted camera.

When processors became faster and image-processing algorithms became more adept at high-throughput analysis, image processing moved into the industrial environment, and industrial line control was added to the mix. For the first time, components, parts and assemblies on an automated assembly line could be quantified, checked and routed depending on the quality assessment, and machine vision (the “eyes of the production line”) was born. Unsurprisingly, the first industry to adopt this technology en masse was PCB and semiconductor manufacturing, as the enormous boom in electronics took place during the 1980s.
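The quantify–check–route idea behind industrial line control can be sketched as follows. This is a hypothetical illustration: the part is gauged from one scan line of its image, and the nominal width and tolerance are invented values.

```python
# Sketch of line control: quantify a part from its image, check it
# against a tolerance, and route it accordingly. Values are hypothetical.

def measure_width_px(scan_line):
    # Quantify: part width = number of non-background (non-zero) pixels
    # along one scan line of the captured image.
    return sum(1 for px in scan_line if px > 0)

def route(width_px, nominal=10, tolerance=1):
    # Check and route: accept the part only if its measured width is
    # within tolerance of the nominal dimension.
    deviation = abs(width_px - nominal)
    return "accept" if deviation <= tolerance else "reject"

scan_line = [0, 0, 200, 210, 205, 200, 198, 207, 201, 199, 0, 0]
width = measure_width_px(scan_line)  # 8 part pixels on this line
print(route(width))  # deviation of 2 exceeds tolerance 1 -> "reject"
```

A real system would gauge many features across the whole image and might route to several destinations (accept, rework, reject), but the principle of measurement-driven routing is the same.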

Computer vision crosses the two disciplines, creating a Venn diagram of overlap between the three areas.

Much of the work on artificial intelligence relating to deep learning has its roots in computer vision. Over the last few years, though, it has moved across into machine vision, as image processing learnt from biological vision, coupled with cognitive vision, slowly enters general-purpose machine vision and vision systems. It’s rarely a two-way street, but some machine vision research does flow back into computer vision and general image processing. Given the far larger reach of the computer vision industry compared with the more niche machine vision industry, new algorithms and developments tend to be driven by the broader computer vision community.

Machine vision, computer vision and image processing are now broadly intertwined, and the developments across each sector have a knock-on effect on the growth of new algorithms in each discipline.
