The Strength of the Instrumentation Industry
Converging Technologies Are the Catalyst Creating Exciting New Growth Areas in Analytical Instrumentation
Dramatic changes to the fundamentals of analytical instrumentation are currently underway. The traditional approach to measuring, monitoring and gathering information, refined over the past 50 years, frequently relies on very expensive, highly capable standalone instruments. Most report data via an internet connection or storage device. A few years ago an incremental step was taken when wireless reporting capability was added to many of these instruments.
But with rapid advances in sensor miniaturization and their minuscule power consumption needs, there is a significant new element at play: distribution. Sensors that are distributed, monitoring, measuring and reporting back to something as simple as a tablet or smartphone dramatically increase the number of data points that can be utilized. Having numerous types of sensors that can share data greatly accelerates analysis and increases the possibility of obtaining true real-time data from large geographical areas or highly complex systems.
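The reporting model described above can be sketched in a few lines. This is a minimal illustration only, with hypothetical names (`Reading`, `Aggregator`, `node-0` and so on are not from any real product): many small sensor nodes push readings to a single inexpensive hub, which is all that is needed to summarize data from a whole fleet.

```python
import random
import statistics
from dataclasses import dataclass

@dataclass
class Reading:
    sensor_id: str   # which distributed node produced the value
    kind: str        # e.g. "temperature", "radiation"
    value: float

class Aggregator:
    """Collects readings pushed by many small sensors; this is the
    role a tablet or smartphone plays in the scenario above."""
    def __init__(self):
        self.readings = []

    def report(self, reading: Reading) -> None:
        self.readings.append(reading)

    def summary(self, kind: str) -> dict:
        values = [r.value for r in self.readings if r.kind == kind]
        return {"count": len(values), "mean": statistics.mean(values)}

# Simulate ten temperature sensors spread over an area, each
# reporting one measurement to the shared hub.
hub = Aggregator()
random.seed(0)
for i in range(10):
    hub.report(Reading(f"node-{i}", "temperature", 20 + random.random()))

print(hub.summary("temperature"))
```

In a real deployment the `report` call would arrive over a wireless link rather than an in-process method call, but the aggregation logic at the hub is essentially this simple.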
The resulting systems developed for these applications, new analytical instruments with their distributed sensors and networks, will expand our ability to monitor and assess results or risks, whether by providing critical information such as exposure to radiation or other hazardous materials, or by revealing correlations among body systems through body-worn sensors that provide warnings or improve performance.
Distributed, miniaturized, communications-enabled devices open the door to entirely new methodologies and expand analysis models for every industry that relies on instrumentation, from medicine to environmental regulation. With the miniaturization of analytical instrumentation and the expansion of the "Internet of Everything" (IoE, the connections between people, processes, data, and things), the next few years will bring the lab out into the world in a way that has never before been possible. Although currently still in the flirting stage, the eventual marriage of big data, miniaturization and distributed analytics will inevitably prove highly disruptive to traditional methodologies.
Fields that benefit from broad and varied analysis are already starting to utilize the structural underpinnings of the social networking models so prevalent in this era of Twitter, Facebook and LinkedIn. By implementing the same protocols and employing tools that enable collaboration at an essential level, in many respects it will soon be people who become the "nodes" (connection points, redistribution points or communication endpoints).
Some of the best analytical instrumentation systems will provide something the user benefits from being involved with directly. What is an example of this? Since it has become clear that more than half of the drugs prescribed to our aging population do not actually work, and that a personalized approach to health care would be much more effective, sensor systems that monitor health parameters such as nutrition, exercise and genetics, and that report and adjust for changing conditions, are a perfect example of immediate user benefit.
On the other hand, many of these analytical systems will simply gather information in the background without involving the user in any way. An example of that model is the sensor suite on vehicles designed for sensing road conditions. The consumer buys a vehicle and uses the information it provides for moment-to-moment awareness of the roads they are driving, while other organizations buy the data for an entirely different purpose, for example analyzing details of weather patterns to optimize crop growth.
Examples of tools and concepts from the social network model that we will see utilized with these new instrumentation systems include:
Organizations will capture vast amounts of data and use sophisticated analytics to determine not only the measurement or monitoring details needed, but also to define and measure participation, independent uses and connections, allowing continual adjustment of models and procedures and resulting in vastly improved effectiveness.
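A minimal sketch of what "measuring participation and connections" could look like in practice. The event log, user names and dataset names below are all hypothetical, invented purely for illustration: each event records that a user contributed to a shared dataset, and from that we derive per-user participation counts and the connections between users who worked on the same data.

```python
from collections import Counter
from itertools import combinations

# Hypothetical interaction log: (user, shared_dataset) pairs, the kind
# of participation record a distributed instrumentation network might emit.
events = [
    ("alice", "air-quality"), ("bob", "air-quality"),
    ("carol", "radiation"), ("alice", "radiation"),
    ("bob", "radiation"),
]

# Participation: how many datasets each user contributed to.
participation = Counter(user for user, _ in events)

# Connections: users linked by contributing to the same dataset.
by_dataset: dict[str, set] = {}
for user, dataset in events:
    by_dataset.setdefault(dataset, set()).add(user)

connections = Counter()
for users in by_dataset.values():
    for a, b in combinations(sorted(users), 2):
        connections[(a, b)] += 1

print(participation)   # per-user participation counts
print(connections)     # shared-dataset links between users
```

The same counting structure scales up: with richer event records, the counters become the inputs a system uses to adjust its models and procedures over time.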
Delivery of computing resources through a global network is a concept rooted in the sixties, generally attributed to J.C. Licklider. Influential in the development of the Advanced Research Projects Agency Network (ARPANET), launched in 1969, Licklider envisioned everyone on the globe being interconnected and able to access data from anywhere. Other sources attribute the cloud concept to John McCarthy, a computer scientist who also proposed computing as a public utility in the 1960s.
The most important influence on cloud computing has been the development of easy-to-consume, reliable "killer apps" from technology titans such as Apple, Google and Microsoft. UK Cloud Computing Pioneer Jamie Turner states, "As cloud computing extends its reach beyond a handful of early-adopter Google Docs users, we can only begin to imagine its scope and reach. Pretty much anything can be delivered from the cloud."
The benefits cloud computing offers in terms of cost reduction, increased flexibility and storage make it an essential element in the amalgamation that analytical instrumentation will evolve into; quite simply, it is the only practical way to manage the big data required for capturing, calculating and making use of all of this automated research.
Advantage Electronic Product Development Inc.
34 Garden Center, Broomfield, Colorado 80020