Big Data is probably one of the most important trends of the moment, alongside digitalization and Industry 4.0. So it seems natural to look at photonics – microscopy, for example – through a Big Data lens: the data sets generated by these technologies are enormous in both number and size. Digital images, especially in 3D imaging, are a good example. A 3D image with 2000 x 2000 x 500 pixels, two color channels and 16 bits of resolution per channel is 8 gigabytes in size. That alone is not really sensational. But if such an image is captured 100 times every hour, around the clock, you end up with about 19 terabytes of data within 24 hours.
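The arithmetic behind these figures can be checked in a few lines (a back-of-the-envelope sketch using decimal gigabytes and terabytes):

```python
# Size of one 3D image: 2000 x 2000 x 500 voxels,
# 2 color channels, 16 bits (= 2 bytes) per channel.
voxels = 2000 * 2000 * 500
channels = 2
bytes_per_sample = 2

image_bytes = voxels * channels * bytes_per_sample
print(image_bytes / 1e9)   # 8.0 gigabytes per image

# 100 captures per hour, 24 hours a day.
images_per_day = 100 * 24
daily_tb = image_bytes * images_per_day / 1e12
print(daily_tb)            # 19.2 terabytes per day
```

So the 19-terabyte figure follows directly from a single, perfectly ordinary imaging workflow.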
Not Big Data yet – but challenges are the same
But are we already talking about Big Data here? Not really. Big Data refers to quantities that are too large or too complex, change too quickly or are too weakly structured to be evaluated manually or with traditional data processing methods. The enormous data quantities in photonics we are talking about here, however, tend to be well structured – strictly speaking, this is Large Data rather than Big Data. Some of the challenges, though, are exactly the same. Even well-structured data raises the question of how to lay it out on a storage device so that it can be accessed efficiently for processing and visualization.
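One common answer to that layout question is chunked storage: the volume is stored as fixed-size blocks, so reading a small region of interest touches only a few blocks instead of the whole multi-gigabyte file. A minimal sketch of the idea, with a hypothetical chunk shape (the names and sizes here are illustrative, not a specific product's format):

```python
from math import ceil

VOLUME = (2000, 2000, 500)   # x, y, z voxels of the 3D image
CHUNK = (200, 200, 50)       # assumed chunk shape for this sketch

def chunks_for_roi(start, stop):
    """Return the chunk indices overlapped by the region [start, stop)."""
    ranges = [range(s // c, ceil(e / c))
              for s, e, c in zip(start, stop, CHUNK)]
    return [(i, j, k) for i in ranges[0] for j in ranges[1] for k in ranges[2]]

# Reading a 100 x 100 x 10 corner region touches one chunk, not the whole volume.
hit = chunks_for_roi((0, 0, 0), (100, 100, 10))
total_chunks = 1
for v, c in zip(VOLUME, CHUNK):
    total_chunks *= ceil(v / c)
print(len(hit), "of", total_chunks, "chunks read")  # 1 of 1000 chunks read
```

Chunked formats such as HDF5 follow this principle, which is one reason they are popular for large microscopy data.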
Large Data will become part of the Big Data picture
So some solutions, such as parallel architectures or smart devices with embedded platforms for greater scalability and higher speed, apply to Big and Large Data alike. And as trends like Industry 4.0 and new methods in healthcare gain momentum, so will Big Data. The same holds for fields like microscopy and industrial optical sensors: the data they generate will be analyzed with Big Data methodology, and so Large Data will become part of Big Data strategies and applications.
So the answer is: it is not either/or. It has to be Large Data and Big Data in photonics.
For a deeper dive into the topic, read the whole article.