How Secure is your API?

By Sairam Bollapragada & Sandeep Mehta

Technology will keep evolving, and existing platforms will keep transforming to make our solutions richer and more far-reaching in impact. APIs are the evolving technological glue, promising strategic and far more complex communication between applications, with many technologists vouching for them.

Just to strengthen the case and context here: an Ovum survey mentions that 30% of APIs are designed without input from infosec teams and 27% proceed through development without the security teams weighing in, while 53% of IT/security professionals feel security teams should be responsible for API security and 47% feel it should be the developers. Of the API platforms used by companies (bought or built in-house), only 22% had protection against four critical attack vectors: developer errors, web/mobile API hijacking, automated scraping, and malicious usage. 45% of API management platforms lack rate-limiting features. The list and the arguments go on.

What are the threats to an API platform?

There are some parameters that should be examined hard whenever an API-based solution design becomes suspect. Let us look at them and ponder detections or workarounds.

  1. Unprotected APIs: APIs should have insulation built around them as a priority. Whether REST or SOAP, access must be strictly controlled, since back-end systems often lack access control, management, and monitoring of their own. Exposed APIs must be dynamically scanned to ensure that system exposure to unprotected assets via APIs is identified whenever requests for access are made through the API layers.
  2. Hack-in Attempts: Attackers are highly persistent and will try to break into systems with a spectrum of techniques. Effective use of rate-limiting services via API gateways to throttle access requests and detect break-in patterns will help; up-front security testing, and policy designs that block users showing patterns of malicious failed break-in attempts, are a good strategy.
  3. Injections: High-impact attack techniques like SQL injection can become the most serious of security failures, with all your information compromised. Key questions to ask: for output data written back to the API caller, is the source of the data authentic, and how is encryption taking place? What is the extent of user control over the data? Since this is a very vulnerable area, tools like sqlmap for testing SQL injection, or Burp Suite for injection and cross-site scripting, are useful for flushing out such threats.
  4. Strong Authentication!: APIs are designed to be exposed for external usage, and hence every caller should be authenticated. The authentication cycle must be completely audited and checked, from request initiation to termination, using approved authentication standards. Application-level testing to ascertain weaknesses in the approved authentication protocols would go a long way toward validating the calling application's token.
  5. Session tampering: When tokens are corrupted or cannot be authenticated, the API server does not know who the caller is, making it impossible to differentiate between well-intended and ill-intended access. Tokens that are tampered with, or replayed with altered privileges, create exactly such scenarios. Token-protection schemes such as cryptographic signing (for example, keyed hashing), and ensuring tokens are fresh using verified timestamps, will help. A test suite that checks that token tampering is identified and tracked, and that tokens are accepted only from authorized sources, should be mandated.
  6. TLS/SSL Protections: TLS/SSL protection ensures that data transferred between users and sites, or between two systems, cannot be read by eavesdroppers. It uses encryption algorithms to scramble data in transit.
  7. Trusted vs. Trustworthy: A trusted system is a system that is relied upon to perform security- or safety-critical operations (so its failure would compromise you); a trustworthy system is one that has been shown to deserve that trust, for example through verified design and sound use of encryption. The two are not the same thing.
  8. API Right Usage: An API should be used for what it is designed for. Many times, the implementation on an API platform exceeds the functionality available on the platform, which exposes the whole platform to a new set of risks. The limitations of an API platform have to be kept firmly in mind whenever it is evaluated for any solution. One more fact to keep in mind is that the API should coexist peacefully with the rest of the solution.
  9. Poor Code: Poor code in an API exposes the platform to many vulnerabilities. Examples of poor coding on an API include not implementing certificate-based authentication and not restricting IP addresses to accept traffic only from known sources. This exposes the API platform to all external IPs, and anyone with basic skills can access the API and perform their operations.
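The rate-limiting defense described in item 2 above can be sketched as a token bucket. This is a minimal illustration, not the implementation of any particular API gateway; the class name and parameters are made up for the sketch:

```python
import time

class TokenBucket:
    """Simple token-bucket rate limiter: each client gets `capacity`
    requests up front, refilled at `rate` tokens per second."""
    def __init__(self, capacity=10, rate=1.0):
        self.capacity = capacity
        self.rate = rate
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill tokens according to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # the gateway would respond with HTTP 429 here

bucket = TokenBucket(capacity=3, rate=0.5)
results = [bucket.allow() for _ in range(5)]  # a burst of 5 quick requests
```

A client that bursts past its capacity is choked until tokens refill, which is exactly the pattern-detection hook a gateway policy can build on.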
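For item 3, the primary defense against SQL injection is parameterized queries rather than string concatenation. A minimal sketch using Python's built-in sqlite3; the table and column names are made up for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user_unsafe(name):
    # VULNERABLE: attacker-controlled input is spliced into the SQL text.
    return conn.execute(
        "SELECT role FROM users WHERE name = '%s'" % name).fetchall()

def find_user_safe(name):
    # SAFE: the driver binds the value; it is never parsed as SQL.
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)).fetchall()

payload = "' OR '1'='1"
# The unsafe version returns every row; the safe one returns none.
```

This is the class of flaw tools like sqlmap probe for automatically.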
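The token-protection scheme in item 5 (keyed hashing plus a freshness timestamp) can be sketched with Python's standard hmac module. The token format, secret, and age limit below are illustrative assumptions, not a standard:

```python
import hmac, hashlib, time

SECRET = b"server-side-secret"   # illustrative; never hard-code a real key
MAX_AGE = 300                    # seconds a token stays fresh

def issue_token(user, now=None):
    ts = str(int(now if now is not None else time.time()))
    msg = f"{user}|{ts}".encode()
    sig = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return f"{user}|{ts}|{sig}"

def verify_token(token, now=None):
    try:
        user, ts, sig = token.rsplit("|", 2)
    except ValueError:
        return None
    expected = hmac.new(SECRET, f"{user}|{ts}".encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None              # tampered token
    if (now if now is not None else time.time()) - int(ts) > MAX_AGE:
        return None              # stale or replayed token
    return user

token = issue_token("alice", now=1000)
```

A tampered token fails the signature check, and an old one fails the freshness check, which is precisely what the test suite in item 5 should exercise.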
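Item 9's point about restricting source IPs can be sketched with Python's standard ipaddress module; the allowed networks below are placeholders for whatever "known sources" a real deployment defines:

```python
import ipaddress

# Hypothetical allow-list of known partner networks.
ALLOWED_NETWORKS = [
    ipaddress.ip_network("10.0.0.0/8"),
    ipaddress.ip_network("192.168.1.0/24"),
]

def ip_allowed(client_ip):
    """Return True only if the caller's address is inside a known network."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in ALLOWED_NETWORKS)
```

A gateway or middleware would run this check (alongside certificate-based authentication) before any request reaches the API proper.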

Finally, whenever an API is being designed or evaluated for usage, the base security parameters have to be sufficiently considered and evaluated. The evaluation or design of the product should always leave enough room to extend security features as and when required. There should also be facilities for remedial measures and alerts that notify users whenever there is a security threat or a breach of the API.


VUCA in the Digital world!!


By Sairam Bollapragada & Rajesh Mohandas

Across the globe, everyone is now connected in unprecedented ways. This is both a boon and a bane: we live in an era that is transforming and setting the stage for the next revolution. In the times when we were disconnected and every country operated in a silo, the challenges were limited to internal affairs and near-border conflicts only.

With technological advances, we look at a bright and secure future on one hand; on the other, the unrest continues and grows bigger by the day. Conflicts, civil unrest, terrorism, ransomware, cyber crimes and more are now integrated into our daily life.

The digital reality is shaking up some of our beliefs and compelling us to move to a more knowledge-driven IT economy. Automation and AI, which were once limited to books, have finally come into the open, challenging us with how they can transform every space of life. Soon, all white space is expected to be filled with cognitive behavior and techniques. Automation is forcing a re-wiring of skills for much of the IT workforce, spelling the end of careers if that re-skilling is not done.

Hence one can relate to the four key parameters of VUCA: Volatility, Uncertainty, Complexity and Ambiguity. Each of these factors challenges the established order of the day, hence the need to cope with them in these turbulent times.

The compounded problem statement, with external influencing factors from market pressures, competition, and shareholder and stakeholder expectations, is a strong indicator that leaders will need to be hard-wired for resilience.

The role of leaders managing the workforce will be crucial and critical in shaping the digital future of any organization. Most of the requirements to support a digital environment are not about the technology per se; they are also about creating the environment to re-skill, building the flexibility to be agile, adapting to changing demands, and grooming the right talent for a safe digital future.

Let us take each parameter one at a time to see what it means in the Digital world:

(V) Volatility: The nature and dynamics of the change blowing across the landscape demand catalysts to adapt to it. A legacy of efficiency and productivity will no longer sustain a business. Disruptive innovations are indeed unsettling dominant industries in today's world. Hence the times call for compulsive innovation and a drift away from standard operating procedures.

(U) Uncertainty: This factor reflects the lack of predictability and the prevalence of surprises. Another indicator of it is the refusal of the current technology wave to move easily beyond the labs. The ever-experimenting mindset also means the solutions themselves are prone to obsolescence, with a high degree of unpredictability, from the very moment they are conceived.

(C) Complexity: Multiple parameters built into the character of an issue spell complexity, whether the issue is chaos-led or confusion-led.

Complexity can also reflect multiple influencing factors which can unsettle things easily. Complexity is good or bad depending on your strategy. Having a bullet-proof strategy is impossible; nevertheless, one should have a solid strategy to counter complexities and challenge them. Even if it comes with a short expiry date (say, two years), you should have one.

The digital space is getting more and more complex with each passing day: a new platform rolled out, new innovations coming to light, new solutions offered, disruptive models coming to life, and so on. Hence, to deal with all these changes, a strategy for managing this change is mandatory.

(A) Ambiguity: The fact that we only know perhaps 40% of how technology will fold into our lives and markets as an influencer is a true reflection of the haze in the digital space. This then raises the question of business risk, which is quite a reality today.

At various levels of an organization there are ambiguities relating to progression and growth, whether at the organizational level or in the careers of professionals. Beyond the lexical meanings of the strategic and the tactical, the lines between the two approaches are thinning out.

Volatility, Uncertainty, Complexity and Ambiguity will continue to exist, but what leaders today can do is play a vital role and attempt to control the levers by moving into a hyperawareness zone of informed decision-making and fast execution. Winning in the digital vortex is not just about algorithms, architectures or innovative business models; it requires organizational change and workforce transformation. And successful transformation is enabled by a company's digital business agility, building on the fact that people are an organization's most important asset. Hence, everybody is compelled to think about forward-looking strategies to adapt to the Digital VUCA scenarios.

The Need for Intelligent Command Control Center for Robots (IC3R)

By Sairam Bollapragada & Rajesh Mohandas

It is predicted that the industrial economy, whether IT or non-IT, will go full throttle in the upcoming FY 2017-18 to create a financial realization called autonomics: unlocking the potential of the robots being conceived. Over 2.5 billion people have at least one messaging app installed; within a couple of years that will reach 3.6 billion, about half of humanity (source: The Economist). However, the outcomes suggested by many big market-research houses have not been up to expectations. With things heating up around automation and artificial intelligence/RPA, we can foresee an increasing need to have some solid controls in place very soon.

Today, the market is focused on industrial networks, industrial robots, machine vision, control valves/devices, field instruments, enclosures and cables. Each of these components has an IT and a non-IT element, with a technology landscape consisting of SCADA, PLC (Programmable Logic Controllers), PAC (Programmable Automation Controllers), RTU (Remote Terminal Units), DCS (Distributed Control Systems), MES (Manufacturing Execution Systems), PLM (Product Lifecycle Management), HMI (Human Machine Interface), and above all safety.

While creating new technical solutions every day and getting excited about them, we are probably too casual about the flip side. Let's focus on the negatives for a moment: what if an unmanned vehicle had a bad bug? What if the programming in an automated manufacturing plant were intercepted or hacked, altering the desired behavior or leading to disturbing outcomes? It could become a nightmare. For example, a recent crash involving an Uber Technologies Inc. driverless car suggests autonomous software sometimes takes the same risks as the humans it may one day replace. While we are creating bots at unprecedented speed and with great passion, we also need to secure these advancements through a control mechanism that will keep the desired outcomes intact. The technological singularity will compel us to start thinking about automatic recovery with deep machine-learning capacity.

Hence, are we talking about having a command-control mechanism to protect the desired outcomes of all the automated bots, whether soft or hard? The answer is yes. We need to soon develop and establish command control centres for any set of digital workforce we want to monitor on a continuous basis, to ensure they stay aligned to expected behavior patterns. In fact, there should be a proper set of guidelines issued by state agencies before any robot is allowed to go commercial in the market. Audits and strictures will help control the release of uncertified or rogue robots. This would be especially true with the craze for smart cities catching on around the globe. The creation of the digital-twin space is also something that must be looked into seriously for potential disruption.

A command control center will help in creating a centralized monitoring service which will track, monitor, and report the behavior of these bots; looked at positively, it could also lend performance improvements toward the desired outcomes. With the introduction of aggressive mind-control technology and drones, we should have proper access control over such technology-based robots. A C3 with end-to-end visibility across robots, and a real-time rolling view, would give us central control of work schedules, job cards, execution, and support for various robotic activities.

While support for high availability/disaster recovery and network load balancing is the intent, the central control mechanism will be mandated to have a cyber-cop kind of functionality. For example, while monitoring a fleet of unmanned vehicles, suppose one vehicle on the road were malfunctioning: one should have the ability (or create one) to monitor it in real time and stop its engine remotely to avoid any major disaster.

A secure central monitoring system laced with analytics could be enabled through a log base into which robots pass every piece of information pertaining to each activity they are instructed to perform. With this much information being logged, one can get deep insight into the business and the activity patterns conducted by or through robots. With so much information at our disposal, one can build a very good analytics use case to understand and comprehend the behavior of these robots as they are unleashed into the market.

The global industrial automation market is extremely fragmented due to the presence of several players in the global market. Some of the leading players operating in the global market are ABB Ltd., FANUC Corporation, Honeywell International Inc., Toshiba Machine Corporation Ltd., Yokogawa Electric Corporation, Emerson Electric Company, General Electric Company, Yaskawa Electric Corporation, Rockwell Automation, Inc., Mitsubishi Electric Corporation, and Voith GmbH.

However, while cherry-picking the best of the robots to make their organizations more productive and efficient, we hope the focus will begin with creating a solid Intelligent Command Control Center upfront to monitor, maintain, track and continuously course-correct these disparate bots, soft and hard alike. The industrial control and factory automation market is projected to reach USD 153.30 billion by 2022, at a CAGR of 4.88% during the forecast period, hence the emphasis. State agencies must work toward evolving policy guidelines, within their borders and beyond, for all entities looking to employ automation and digital bots effectively.

Quantum Computing – expectations in Digital Era !!


The advent of digital in all facets of life is increasing the demand for better, faster, stronger, more reliable and safer infrastructure for the common man, who is using social media, networks, mobility and shared cloud services implicitly like never before.

If anything, these demands will only go northwards. Some areas that were thus far only theoretical have started assuming practical shape, like quantum computing and artificial intelligence.

IT headaches have increased many-fold due to delays. 55% of IT professionals experience downtime, causing unproductive hours; 24 petabytes of data are processed by Google daily; 2 hours of video are uploaded to YouTube every minute; 2.9 million emails are sent every second; 375 MB of data are consumed by households each day; 72.9 items are ordered on Amazon per second... and still the demand for more speed and more capacity never ceases!

The human brain has 100 billion neurons, each connected to 10,000 other neurons; it is the most complex and complicated computer known in the universe, connected to the super-conscious. Today, going beyond the binary world of zeros and ones we learnt in our early days of computer education, there is a new dimension in which zeros and ones co-exist together, intelligently making decisions and assisting Gen Z in a gadget-friendly world.

Computers are doubling in power every 18 months, states Michio Kaku (an American theoretical physicist, futurist and popularizer of science, and a professor of theoretical physics at the City College of New York and the CUNY Graduate Center), but this will not go on forever. He predicts the silicon revolution will be over within the next 15 years. That is why quantum computing is the future of technological advancement.

Quantum computing, conceptualized as early as the 1980s, is still in its infancy in 2016. However, while practical and theoretical research continues, the fact that it has been made a reality in some labs, and is commercially available in a limited way at high cost, breathes hope into the expectations. In the realm of quantum physics, particles can act as waves, as particles, or as both at once (superposition).

In superposition, a qubit can be 0, 1, or both 0 and 1, which means one qubit can represent 2 states simultaneously; hence 2 qubits can represent 4 states and 3 qubits 8 states, and this 2^n exponential expansion of simultaneous states continues, leading to unimaginably large working capacities.
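The 2^n growth described above can be made concrete in a few lines of Python: a register of n qubits is described by 2**n simultaneous basis states, which is why simulating even modest quantum machines overwhelms classical hardware.

```python
def num_states(n_qubits):
    """A register of n qubits spans 2**n simultaneous basis states."""
    return 2 ** n_qubits

# 1 qubit -> 2 states, 2 qubits -> 4, 3 qubits -> 8, ...
growth = [num_states(n) for n in range(1, 6)]
# Even a 50-qubit register already spans over 10**15 basis states.
```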

These concepts are fast converting to reality, with companies like D-Wave building the world's first real quantum computers (in closed environments, with the limiting dimensions of light, temperature and cost). The digital space, reflecting the need for near-real-time scenarios, can leverage this and make it an excellent life-changer. While we are bringing in game changers with the advances in digital areas, quantum computing could lend a huge impetus to the entire phenomenon if it were included in the strategic planning of digital solutions.

Quantum computing can be expected to bring huge transformations in areas such as the following (though those mentioned are just a few of many):

  1. Artificial Intelligence: In 2014, a Chinese team of physicists trained a quantum computer to recognize handwritten characters, the first demonstration of "quantum artificial intelligence". The software that Google has developed on ordinary computers to drive cars or answer questions could become vastly more intelligent with the use of QC. Quantum effects have been shown to have the potential to provide quadratic improvements in learning efficiency, as well as exponential improvements in performance for short periods, compared to classical techniques for a wide class of learning problems. All solutions based on cognitive techniques would also benefit hugely from QC. The primary aim of the cognitive sciences is to provide explanations of important mental functions, including perception, memory, language, inference, and learning.
  2. Cryptography: This field is gaining significance by the day against the background of digital solutions. Since a quantum state changes when it is observed, it is possible to design ways of ascertaining whether a person has eavesdropped on a message, which can lead to real-time detection of breaches. Using that method, people can send each other encryption keys (strings of symbols used to encrypt and decrypt messages) and be sure they will be alerted straight away if someone has intercepted the key.

The technological potential of quantum computing was first realized in Shor's 1994 formulation of a polynomial-time quantum algorithm for factoring a number into its constituent primes, a problem for which the best known classical algorithms require super-polynomial time.

The apparent intractability of prime factorization under classical computation is central to the security of the cryptographic schemes prevalent today. If implemented on a suitable quantum machine, Shor's algorithm could potentially break the encryption depended on by governments, banks, and millions of individuals.
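To see why factoring is the hard problem these schemes rest on, here is a naive classical trial-division factorizer. It is purely illustrative (not Shor's algorithm, which needs quantum hardware): its cost grows roughly with the square root of N, which becomes astronomical for the 2048-bit moduli used in RSA-style schemes.

```python
def trial_division(n):
    """Classical factoring by trial division: about sqrt(n) divisions."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

# A toy RSA-style modulus: trivial here, hopeless at 2048 bits.
toy_factors = trial_division(3 * 5)
```

Shor's quantum algorithm would solve the same problem in polynomial time, which is exactly what makes it a threat to today's public-key cryptography.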

  3. Cyber security: Wireless networks, shopping or paying bills online, logging into password-protected web accounts, and toting always-connected mobile devices all present constant security exposure. The overwhelming computational abilities that would arrive with quantum computers could quickly unravel the most advanced public-key encryption available today, rendering current cryptographic standards obsolete. If quantum computers appeared as a viable technology tomorrow, there would be precious few alternative and acceptable means of securing our online and wireless transactions, and protecting information and computation is the backbone of the security that converts your message into a secure format able to reach its destination without being read by an eavesdropping party.
  4. Economics: Supercomputers were much talked about in real-time economics scenarios in the 1970s, and they did deliver their due outcomes. With quantum computing, however, game theory will become more intriguing and interesting. Computational thinking is transforming economics, spawning the new field of computational microeconomics.
  5. Medicine: The qubit is expected to transform magnetic resonance imaging (MRI), which can be looked to for the study of smaller body cells with high precision. Another area of large expectation in biomedicine is DNA sequencing, where quantum computing can help with the large-scale processing of data needed to bring improvements into clinical use.
  6. BI and Data Analytics: High-performance computing has always been an expectation in this field, and the better the machines, the better the data-processing capabilities that improve our decision-making abilities. Quantum computing has raised imagined speeds of data processing, alongside frameworks like HDFS and MapReduce, to new heights. Complex and complicated data, structured or unstructured, can be expected to be treated with ease as the superposition characteristics of qubits are leveraged.
  7. Agriculture: Owing to the genomics-based data explosion and concurrent computational advances, the ICAR in India claims a paradigm shift in the agricultural sciences. Better and bigger data-processing capabilities can bring faster research advances in bioinformatics, be it the modelling of cellular functions, metabolic pathways, the validation of drug trials and targets, the genetic engineering of crops for better yields, or genetic networks, and this throws in hope of stopping the calamities due to crop failures in rural areas of developing nations. Here again QC can play a very critical part, as it can in precision agriculture, a concept leveraging digital solutions that is expected to reach a market of $4.55 billion in 2020.
  8. Logistics: A new optimization algorithm was devised for logistics companies by Manchester Metropolitan University to calculate the best routes and times to send vehicles on the road in the most efficient way. The code, a quantum-annealing optimization algorithm, is one of the first in a new generation of optimization techniques that could revolutionize logistics for businesses and a range of other applications.

While quantum computing is largely about converging on solutions to optimization problems, the application domains span a spectrum, be it finding the most optimized approach to power consumption, or applications in artificial intelligence, economics, genomics or medicine. Quantum computing can also be looked to for some of the complex machine-learning problems where we build models to make accurate predictions.

(At an estimated $15 million price tag, a handful of organizations like Google, NASA, Amazon and Lockheed Martin have emerged as forward-thinking organizations ready to bet on future technologies.)

Towards a better hope, bigger future!!

IoT Security is everybody’s business!! – Part 2

By Sairam Bollapragada

We identified the risks and potential threats to our way of living in Part 1 of this blog. In this part, let us discuss some preventive ways to secure our living: remedial steps that will help repose faith in our technology-driven lives.

A study by Hewlett Packard shows that around 70% of connected devices are prone to serious threats. Many consumers of technology, roughly more than 76%, do not understand or appreciate these risks. The attitude is: "...it has not impacted me so far...".

To deal with this, let us identify the top 10 security issues with IoT to increase our awareness. These could be potential sources:

  1. Insufficient authentication or authorization
  2. Insecure Web interface
  3. Insecure network services
  4. Insufficient security configuration
  5. Privacy concerns
  6. Insecure mobile interface
  7. Lack of transport encryption
  8. Insecure software or firmware
  9. Insecure cloud interface
  10. Poor physical security

The above list, though not exhaustive, is definitely worth pondering.

All organizations rallying to be the top IoT product and solution providers must compel themselves to create hardened security platforms that make their solutions bullet-proof against any resulting vulnerability.

While everybody would love to believe prevention is better than cure, we cannot ignore the detection and detention of rogue application creators, hackers, disruptors and havoc-makers. The cyber laws of all lands embracing such technological progress (it leaves none untouched) need to be made more stringent, with better detection and preventive outcomes. A new brand of cyber-cops will need to be constituted: officers with in-depth knowledge and technical capabilities (extensively trained) to

  • Comprehend the types of crimes that can be committed
  • Apply analytical skills to trace the equipment used in a crime
  • Understand device characteristics and their potentially vulnerable points
  • Analyze the data generated by millions of devices
  • Profile the device types used in a crime
  • Understand data-privacy laws and detect the extent of damage
  • Fully understand the compliance laws of several vertical industries (like BFSI)
  • Know most of the categorized IoT devices used in solutions
  • And many more

What I am indicating is that the cyber police can no longer be selective, location-based, optimized teams in a police station, but must be properly networked teams with extensive technical knowledge of the field. They must be equipped with applications and mechanisms to establish crime patterns and behavioral trends for each typical class of crime being committed. These can also be virtual teams working in a distributed pattern while building a virtual cyber-security data center, with enough capability and credibility to nip crime in the bud, bringing speed and effectiveness to the crime scene.

While preparing for this so-called 3rd Industrial Revolution, policy makers must take the following actions as part of their readiness:

  • Defining and designing cyber threat intelligence (CTI)
  • Defining the cyber-security ecosystem, including suppliers, partners, vendors and business networks
  • Forming cyber cells in each department of citizen services to create preventive mechanisms for tracking cyber-crimes and intervening at greater speed
  • Creating a level of understanding among organizations for strong governance, controls and accountability
  • Enlisting high-valued assets (buildings, transport, and physical data centers, among many) and provisioning for their safety against such attacks
  • Using forensic analytics continuously to understand cyber-threat sources and their patterns through threat-intelligence data
  • Instituting policies to monitor all financial transactions through mobile devices to understand the modus operandi

Cyber security can no longer be tagged only to IT engineers in this digital era, especially where engineering organizations are embracing it in a big way. With the amalgamation of engineers from various branches to form IoT teams, it has to be a collaborative effort by both core engineers and IT engineers to create the safeguards. Every solution must be scrutinized for security threats, and protection provisioned, as part of each IoT solution. Penetration-testing techniques will need more sophistication to weed out holes, and at a much better pace.

There must be security norms laid out, and every customer must at all times think about and demand security wrappers around the solutions being doled out. We hate to say this, but CYBER SECURITY CAN BECOME A NIGHTMARE if not taken care of!!

IoT Security is everybody’s business!! – Part 1

By Sairam Bollapragada

With the digital wave, the structure of IT organizations, especially those racing to embrace new technologies and IoT, is poised for a paradigm shift. Every brilliant side of a technological revolution comes with a darker patch as well. With so much data slated to be generated via connected devices, cyber security can no longer be the forte of IT folks ONLY.

While technology brings in convenience, it also comes at a cost (read: the flip side).

In the recent past in India, we have started seeing mobile wallets increasingly being used for payments and other financial transactions to other devices or accounts. Connected wallets also create opportunities for hackers to break in and creatively lay their hands on information pertaining to transactions, account details, payee details and numbers, payment patterns, sources of funds, and much other confidential data one would not like to divulge.

Cyber security will don a new hat with the advent of new technologies and devices working in tandem. Stopping break-ins will need much more intelligence, and smarter techniques will have to be devised. The provisioning of security for these mushrooming applications and connected devices needs to be understood well, so that people know they are secure while transacting through gateways to personal data. The approach itself requires comprehensive techniques.

Mobile channels will offer attackers more incentives as the volumes of both devices and transactions increase. The global reach of mobiles has put standard techniques in the hands of hacking communities worldwide. Ubiquity and connectivity make mobile devices vulnerable and easy to reach. The incentives are undoubtedly greatest for mobiles used for financial transactions. It may not be hard for hackers to learn which user uses which number to carry out which financial transactions.

The richer a mobile's features, the more it becomes a target for hackers. Concern about privacy invasion by advertisers is rising steeply with these smarter devices. In 2010-11, the Wall Street Journal tested 101 Android/iOS applications and found that more than half sent device information, 47 shared location data, and 5% sent users' personal information to advertisers without the consent of the users.

More than 1,000 malware strains target mobile devices globally. A single worm attack can rapidly infect handsets to the tune of millions. As mobiles get more advanced, so do the worms, raising the sophistication and quality of their attacks. As carriers improve device capability, Bluetooth and Wi-Fi are also becoming airborne contaminators. Some viruses even dial international numbers while the subscriber is sleeping.

Mobile computing increases data loss as well. With connected devices expected to transmit data across applications and other devices, hackers will try every means to create opportunities in the chaos. Mobile banking has also brought in rogue applications that smartly work their way to gathering financial information from devices, sometimes by riding on legitimate applications topped with malware at app stores.

On top of all this, it is said that more than 37% of service providers do not have any threat intelligence program.

Impacting Scenarios

As hackers take control of connected devices, the very capabilities for which IoT was brought in (efficiency, productivity, ease, etc.) will be compromised. It is scary to even think what happens if people are unable to stop machines, especially large ones, that were put under the control of connected devices for convenience. IT security alone will not stand its ground here; extended knowledge of applied industrial controls and production processes will be mandatory to put the checks and balances in place. (What if one is not able to stop a blast furnace in a steel plant?)

Water Management: Anything scarce and essential comes under a cloud of threat and attracts disruptive opportunities. Water management through connected devices is becoming a lucrative offering from many vendors, ensuring appropriate water quality, controlled supply, treatment, metering and other features. Water, like electricity, is also vulnerable where automated valves and control mechanisms for pressure and flow are operated through technology. A loss of control could cause widespread wastage and lead to a water crisis.

Patient Health Records (PHR)

PHRs are too personal a set of data to be made privy to others. These records reveal several confidential parameters of an individual's health profile: historic ailments, recent health issues, blood group information, and much more data that attackers could tamper with, destroy, or hold for ransom. Very dangerous but true, not because we need to be scared, but because awareness of such a threat will be missing until the first casualty occurs.

Nuclear plants, used for positive purposes like generating power, can be a huge source of risk if they were to lose hold over the control process of the nuclear reactors. If IoT-based controllers were deployed in these plants for analytics and other accompanying research advantages, there should be exhaustive checks and audits built in, plus multiple approvals at multiple governance decision points, to ensure disasters are at least minimized.

Likewise, hacking connected or smart cars can lead to road disasters. This includes hacking smart traffic management, a feature of smart cities. Insurance transactions can be blocked and claims disabled or diverted, especially as insurance segments move from statistics-based to individual fact-based policies.

Cloud is another source of vulnerability. The plethora of data being stored in the cloud will require tighter security solutions, and hence cloud data security will only become more crucial.

It is said that M2M communications alone will generate about $900 billion in revenue by 2020.

Dependency on connected devices for various aspects of the futuristic work-style, such as improved real-time decision making, better solution design, reliance on the resulting data analytics (what about data quality?), future product conceptualization, fleet management, and many others, could become a liability if the systems malfunction due to malware or cyber-attacks.

The above are potential scenarios where the flip side of technology, if misused, can create disasters and cause unimaginable disruption. However, it is not too late to create a strategic security blueprint and raise awareness among the public embracing these newer, emerging solutions.

We will discuss the potential next steps (what we should do, what state agencies should do, and what general users should know) in the sequel to this blog shortly. Till then, happy reading…