Converging Technologies Are Catalysts for Exciting New Growth Areas in Analytical Instrumentation


Dramatic changes to the fundamentals of analytical instrumentation are currently underway. The traditional approach to measuring, monitoring and gathering information, built on methods developed over the past 50 years, frequently relies on very expensive, highly capable standalone instruments. Most report data via an internet connection or storage device. A few years ago, an incremental step was made when wireless reporting capability was added to many of these instruments.


…But with rapid advances in sensor miniaturization and sensors' minuscule power consumption, there is a significant new element at play: distribution. Sensors that are distributed, monitoring, measuring and reporting back to something as simple as a tablet or smartphone, dramatically increase the number of data points that can be utilized. Having numerous types of sensors that can share data accelerates analysis enormously and increases the possibility of having true real-time data from large geographical areas or highly complex systems.
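As a rough illustration of this fan-in pattern, many small sensors reporting to a single lightweight collector (such as an app on a tablet) might be modeled as follows. This is a minimal sketch; all class and field names here are hypothetical, not from any particular product:

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class Reading:
    sensor_id: str
    kind: str        # e.g. "radiation", "temperature"
    value: float
    timestamp: float

class Collector:
    """Aggregates readings streamed in from many distributed sensors."""
    def __init__(self):
        self.latest = {}                   # sensor_id -> most recent Reading
        self.by_kind = defaultdict(list)   # kind -> all readings of that kind

    def ingest(self, r: Reading):
        prev = self.latest.get(r.sensor_id)
        if prev is None or r.timestamp >= prev.timestamp:
            self.latest[r.sensor_id] = r   # keep the freshest value per sensor
        self.by_kind[r.kind].append(r)     # keep history for cross-sensor analysis

    def mean(self, kind: str) -> float:
        vals = [r.value for r in self.by_kind[kind]]
        return sum(vals) / len(vals)

# Three readings from two distributed sensors arriving at one collector:
c = Collector()
c.ingest(Reading("s1", "temperature", 20.0, 1.0))
c.ingest(Reading("s2", "temperature", 22.0, 1.5))
c.ingest(Reading("s1", "temperature", 21.0, 2.0))
```

The point of the sketch is that every additional sensor simply adds rows to the same pool, so the number of usable data points grows without any change to the collector.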


The resulting systems developed for these applications, new analytical instruments with their distributed sensors and networks, will expand our ability to monitor and assess results or risks: by providing critical information, such as exposure to radiation or other hazardous materials, or by revealing correlations among body systems through body-worn sensors that provide warnings or improve performance.


Distributed, miniaturized, communications-enabled devices open the door to entirely new methodologies and expanded analysis models for every industry that relies on instrumentation, from medicine to environmental regulation. With the miniaturization of analytical instrumentation and the expansion of the “Internet of Everything” (IoE, the connections between people, processes, data, and things), the next few years will bring the lab out into the world in a way that has never before been possible. Although still just in the flirting stage, it will inevitably be a polygamous marriage of big data, miniaturization and distributed analytics that proves highly disruptive to traditional methodologies.


Fields that benefit from broad and varied analysis are already starting to utilize the structural underpinnings of the social networking models so prevalent in this era of Twitter, Facebook and LinkedIn. By implementing the same protocols and employing the same collaboration tools at an essential level, it will soon be people who become, in many respects, the “nodes” (connection points, redistribution points or communication endpoints).


Some of the best analytical instrumentation systems will provide something the user benefits from being directly involved with. What is an example of this? Since it has become clear that more than half of the drugs prescribed to our aging population don’t actually work, and that a personalized approach to health care would be much more effective, sensor systems that monitor health parameters such as nutrition, exercise and genetics, and that report and adjust for changes in conditions, are a perfect example of immediate user benefit.


On the other hand, many of these analytical systems will simply gather information in the background without involving the user in any way. An example of that model is the sensor suite on vehicles designed for sensing road conditions. The consumer buys a vehicle and uses the information it provides for better moment-to-moment awareness of the roads they are driving, but other organizations buy the data, for example, to analyze details of weather patterns for an entirely different purpose, such as optimizing crop growth.
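One way to picture this dual-use model is a single sensor stream with two independent subscribers: the driver's dashboard, which only needs the latest value, and a data buyer, which archives everything for later analysis. A minimal sketch, with all names hypothetical:

```python
class RoadSensorFeed:
    """A vehicle's road-condition stream, fanned out to multiple consumers."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self, fn):
        self.subscribers.append(fn)

    def publish(self, event):
        for fn in self.subscribers:
            fn(event)

dashboard = {}   # driver's view: only the latest value per metric matters
archive = []     # data buyer's view: the full history is what has value

feed = RoadSensorFeed()
feed.subscribe(lambda e: dashboard.update({e["metric"]: e["value"]}))
feed.subscribe(archive.append)

feed.publish({"metric": "surface_temp_c", "value": 3.5})
feed.publish({"metric": "surface_temp_c", "value": 2.9})
```

The same readings serve both parties: the dashboard overwrites itself with each update, while the archive accumulates every event for the entirely different downstream purpose.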


Examples of tools and concepts from the social network model that we will see utilized with these new instrumentation systems include:

  • Immediacy – from real-time monitoring and reporting
  • Data storage – the distributed analytical instrument’s ability to store information for analysis, compilation or proof of compliance and to purge the data when no longer needed
  • Content sharing – not only with the instrument’s own organization but also with the rest of the network
  • Collaboration tools
  • Transparency – so people will know what they are contributing to and who they are engaging with
  • Communications – the ability to share or otherwise participate in the efforts via a variety of formats
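The “store, then purge when no longer needed” behavior in the list above can be sketched as a simple retention window. This is an illustrative sketch only; the class name, window length and record format are assumptions, not from any real instrument:

```python
class RetentionStore:
    """Keeps records only as long as a retention window requires,
    e.g. for analysis, compilation or proof of compliance."""
    def __init__(self, retention_s: float):
        self.retention_s = retention_s
        self.records = []          # list of (timestamp, payload)

    def add(self, payload, now: float):
        self.records.append((now, payload))

    def purge(self, now: float):
        # Drop anything older than the retention window.
        cutoff = now - self.retention_s
        self.records = [(t, p) for (t, p) in self.records if t >= cutoff]

store = RetentionStore(retention_s=60.0)
store.add("reading-A", now=0.0)
store.add("reading-B", now=50.0)
store.purge(now=100.0)   # reading-A is now 100 s old and falls outside the window
```

In a real compliance setting the purge would typically run on a schedule, and the retention period would come from the regulation rather than a constant.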


Organizations will capture vast amounts of data and use sophisticated analytics not only to determine the measurement or monitoring details needed, but also to define and measure participation, independent uses and connections, allowing continual adjustment of models and procedures and resulting in vastly improved effectiveness.


Delivery of computing resources through a global network is a concept rooted in the 1960s, generally attributed to J.C.R. Licklider. Licklider, whose work led to the development of the Advanced Research Projects Agency Network (ARPANET) in 1969, envisioned everyone on the globe being interconnected and accessing data from anywhere. Other sources attribute the cloud concept to John McCarthy, a computer scientist who, also in the 1960s, proposed the idea of computing as a public utility.


The most important influence on cloud computing has been the development of easy-to-consume, reliable “killer apps” from technology titans such as Apple, Google and Microsoft. UK cloud computing pioneer Jamie Turner states, “As cloud computing extends its reach beyond a handful of early-adopter Google Docs users, we can only begin to imagine its scope and reach. Pretty much anything can be delivered from the cloud.”


The benefits cloud computing offers in cost reduction, flexibility and storage make it an essential element in the amalgamation that analytical instrumentation will evolve into because, quite simply, it is the only practical way to manage the big data required for capturing, calculating and making use of all of the automated research.

The Strength of the Instrumentation Industry


…The largest change to the R&D budget is a 6.3% reduction in the Department of Defense (DOD) funding. The Department of Commerce’s National Institute of Standards and Technology (NIST) netted an increase of $1.428 billion for its primary research laboratories and the establishment of the National Network for Manufacturing Innovation with $1 billion in mandatory funding to promote the development of manufacturing technologies with broad applications.

The 2014 R&D budgets of NIST, the NSF and the Department of Energy’s Office of Science are targeted for doubling from their 2006 levels by the America COMPETES Act, and the America COMPETES Reauthorization Act of 2010.

The Office of Science and Technology Policy asserts that the president’s 2014 budget “maintains the President’s commitment to increase funding for research”. The President’s FY2014 request sets a pace that would double the FY2006 level over a period of 17 years.

The trending news for technology is that the budget specifically continues support for three multi-agency R&D initiatives in FY2014:

  • $1.704 billion for the National Nanotechnology Initiative, although this is an 8.6% reduction, due primarily to reduced funding at DOD and NSF
  • close to $4 billion for the Networking and Information Technology Research and Development program, an increase of 4.2% over FY2012
  • a 6% increase for the U.S. Global Change Research Program

It is clear that there is a notable interest in ramping up support for emerging disciplines in Science & Technology. The climate is looking great for the advancement of technology and the world is ready.


Advantage Product Development Cores™


Advantage Wireless Controller™ datasheet:


Advantage Wireless Sensor™ datasheet:


Advantage Bluetooth Smart v4.0 (BLE) Core™


Advantage ZigBee to Bluetooth Smart 4.0 Core™