Software will power the Internet of Things

Today’s connected world is moving from devices toward things: by using increasingly low-cost sensors embedded in devices, we can create many new use cases. These span cities, vehicles, homes, offices, factories, retail environments, worksites, logistics, and healthcare. These use cases rely on ubiquitous connectivity and generate massive amounts of data at scale. The underlying technologies enable new business opportunities, new ways to optimize and automate, and new ways to engage with users.
These capabilities have been enabled by a perfect storm of converging technologies spanning hardware, transport, and analytics:
  • Inexpensive sensors – As highlighted by research from ITAC (http://itac.ca/uploads/events/execforum2010/rob_lineback_10-6-10-2.ppt), sensor prices have been dropping continually.
  • Ubiquitous internet access – Pervasive mobile technology means devices and sensors can be connected to the internet. Less than a decade ago, the powerful computers in our pockets did not exist. Standard connectivity technologies such as Bluetooth, Wi-Fi, NFC, and Zigbee, combined with ubiquitous APIs, make these connections possible.
  • Cloud technology – High-speed, on-demand processing and storage in the public cloud create the backbone for collecting and analyzing information. These resources and platforms are easily accessible to anyone who wants to collect data and gain insight into how a thing is being used.
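To make the pipeline these three trends enable more concrete, here is a minimal sketch of the first hop: a low-cost sensor reading is packaged as JSON for shipment to a cloud ingestion service. The device ID, metric name, and payload fields are illustrative assumptions, not any particular product's API:

```python
import json
import random
import time

def read_temperature_sensor():
    # Simulated low-cost sensor: returns a temperature in Celsius.
    # A real device would read from hardware (e.g. over an I2C bus).
    return round(20.0 + random.uniform(-2.0, 2.0), 2)

def build_payload(device_id, reading):
    # Package the reading as JSON, the common currency of cloud ingestion APIs.
    return json.dumps({
        "device_id": device_id,        # illustrative field names
        "metric": "temperature_c",
        "value": reading,
        "timestamp": int(time.time()),
    })

# In a real deployment this payload would be POSTed over Wi-Fi or a
# cellular link to a cloud endpoint for storage and analysis.
payload = build_payload("thermostat-42", read_temperature_sensor())
print(payload)
```

At scale, thousands of such payloads per second arriving from the field are what make on-demand cloud processing and storage the natural backend.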
What is the glue that makes all of this possible? Software is the key to IoT: it makes everything function together and creates these new capabilities and opportunities. This is why we believe seeing inside the software is key to visibility, both for troubleshooting and for creating insight into the IoT. The complexity and scale presented by IoT on both the backend (in the cloud) and the frontend (the things themselves) are major challenges not only for the systems themselves, but for the tooling that manages these interconnected and fluid systems.
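One lightweight form of "seeing inside the software" is instrumenting code to record how long each operation takes, so slow paths can be spotted. The decorator below is a minimal, hypothetical sketch of that idea (the names and the in-memory store are illustrative, not any vendor's implementation):

```python
import time
from functools import wraps

# Maps function name -> list of observed call durations in seconds.
latencies = {}

def instrument(func):
    """Record the wall-clock latency of each call, a minimal visibility hook."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return func(*args, **kwargs)
        finally:
            latencies.setdefault(func.__name__, []).append(
                time.perf_counter() - start)
    return wrapper

@instrument
def process_reading(value):
    # Stand-in for backend work on an incoming sensor reading:
    # convert Celsius to Fahrenheit.
    return value * 1.8 + 32

result = process_reading(25.0)
```

A production monitoring agent does far more (tracing across services, sampling, aggregation), but the principle is the same: the measurement lives inside the software, where the behavior actually happens.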
Other key cautions for IoT include:
  • Software is not being managed properly, in terms of either availability or performance.
  • Ownership of the data being collected and mined.
  • Security of collected data, which can be used for malicious purposes.
  • Battery technology has largely not evolved in 15 years or more; this is a major limitation for today’s devices and connected things.
As a company, AppDynamics believes that IoT will be a key part of the computing and interconnected systems of the future. Our customers are increasingly applying our technologies to these use cases, and we look forward to becoming an integral part of both collecting and analyzing data within these systems.
