How Secure is your API?

By Sairam Bollapragada & Sandeep Mehta

Technology will keep evolving, and existing platforms will keep transforming to make our solutions richer and more far-reaching in impact. APIs are the evolving technology glue, promising strategic and far more complex communication between applications, with many technologists vouching for them.

Just to strengthen the case and context here, an Ovum survey mentions that 30% of APIs are designed without input from infosec teams, and 27% proceed through development without security teams weighing in; 53% of IT/security professionals feel security teams should be responsible for API security, while 47% feel it should be developers. Of the API platforms used by companies (bought or built in-house), only 22% had protection against four critical attack vectors: developer errors, web/mobile API hijacking, automated scraping, and malicious usage. And 45% of API management platforms lack rate-limiting features. The list and arguments go on.

What are the threats to an API platform?

There are some parameters worth jotting down and examining closely whenever an API-based solution design comes under suspicion. Let us look at them and ponder detections or workarounds.

  1. Unprotected APIs: APIs should have insulation built around them as a priority. Whether REST or SOAP, access must be strictly controlled, because back-end systems often lack access control, management and monitoring of their own. Exposed APIs must be dynamically scanned so that system exposure of unprotected assets through the API layer is identified whenever access requests come through.
  2. Hack-in Attempts: Attackers are highly persistent and bring a spectrum of techniques to break into systems. Effective use of rate-limiting at the API gateway to choke access requests and detect break-in patterns will help (a minimal rate-limiter sketch follows this list); upfront security testing and policy designs that block users showing patterns of malicious failed break-in attempts are a good strategy.
  3. Injections: High-impact attack techniques like SQL injection can become the most serious security failure, with all your information compromised. Key questions: for output data written back to the API caller, is the source of the data authentic, and how is it encrypted? What is the extent of the user's control over the data? Since this is a very vulnerable area, tools like sqlmap (for testing SQL injection) and Burp Suite (for injection and cross-site scripting alike) are useful for catching such threats; parameterized queries, sketched after this list, are the standard prevention.
  4. Strong Authentication!: APIs are designed to be exposed for external usage, so every caller must be authenticated. The authentication cycle must be fully audited and checked from request initiation to termination using approved authentication standards. Application-level testing that probes for weaknesses in the approved authentication protocols goes a long way toward validating each calling application's token.
  5. Session depravity: When tokens are corrupted or cannot be authenticated, the API server does not know who the caller is and cannot differentiate well-intended from ill-intended access. Tokens that are tampered with, or replayed with altered privileges, create exactly this scenario. Token-protection schemes such as signing or hashing, together with freshness checks against a verified timestamp, will help (see the signed-token sketch after this list). A test suite that confirms token tampering is detected and tracked, and that tokens are accepted only from authorized sources, should be mandated.
  6. TLS/SSL Protections: TLS/SSL protection ensures that data transferred between users and sites, or between two systems, cannot be read by third parties in transit. It uses encryption algorithms to scramble data on the wire (a client-side sketch that enforces certificate verification follows this list).
  7. Trusted vs. Trustworthy: A trusted system is one that is relied upon to perform security- or safety-critical operations, so its failure can compromise you; a trustworthy system is one that has been shown to deserve that trust, for example through verified design and sound use of encryption. The distinction matters when deciding what an API may safely depend on.
  8. API Right Usage: An API should be used for what it was designed for. Many times an implementation demands more than the functionality the API platform actually offers, which exposes the whole platform to a new set of risks. The limitations of an API platform have to be weighed seriously whenever it is evaluated for a solution. One more fact to keep in mind: the API should peacefully coexist with the other components of the solution it plugs into.
  9. Poor Code: Poor code in an API exposes the platform to a lot of vulnerabilities. Examples include not implementing certificate-based authentication and not restricting source IP addresses to known ranges (a simple allowlist sketch follows this list). Such gaps expose the API platform to all external IPs, so anyone with basic skills can access the API and perform their operations.
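To make the rate-limiting advice in item 2 concrete, here is a minimal token-bucket limiter. This is an illustrative sketch in Python, not any particular gateway product's implementation; all names in it are hypothetical.

```python
# Minimal token-bucket rate limiter (illustrative sketch for item 2).
import time

class TokenBucket:
    def __init__(self, rate: float, capacity: int):
        self.rate = rate            # tokens added per second
        self.capacity = capacity    # maximum burst size
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False   # caller should answer with HTTP 429 Too Many Requests

# One bucket per caller identity (e.g. API key) chokes abusive clients.
buckets = {}
def is_allowed(api_key: str) -> bool:
    bucket = buckets.setdefault(api_key, TokenBucket(rate=5, capacity=10))
    return bucket.allow()
```

A real gateway would add distributed state and per-route policies; the core idea of refilling a per-caller budget over time is the same.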
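For the injection risks in item 3, the standard prevention is to keep user input out of the query text entirely via parameterized queries. A minimal sketch using Python's built-in sqlite3; the table, data and payload are made up for illustration.

```python
# Illustrative sketch for item 3: parameterized queries defeat SQL injection
# by keeping data out of the query text.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"   # a classic injection payload

# Vulnerable: string concatenation lets the payload rewrite the query.
# rows = conn.execute("SELECT role FROM users WHERE name = '" + user_input + "'")

# Safe: the driver binds the value; the payload is treated as plain data.
rows = conn.execute("SELECT role FROM users WHERE name = ?",
                    (user_input,)).fetchall()
print(rows)   # [] -- the injection payload matches no user
```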
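For items 4 and 5, a common token-protection scheme is to sign the token and embed a timestamp, so that both tampering and stale replays are rejected. A minimal sketch assuming an HMAC-based design; the secret, field layout and maximum age are hypothetical choices, not a prescribed standard.

```python
# Illustrative sketch for items 4-5: an HMAC-signed token with a timestamp,
# so tampered or replayed (stale) tokens are rejected.
import hmac, hashlib, time

SECRET = b"server-side-secret"   # in practice, fetched from a key store
MAX_AGE_SECONDS = 300            # tokens older than this are considered stale

def issue_token(user_id: str) -> str:
    payload = f"{user_id}:{int(time.time())}"
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}:{sig}"

def verify_token(token: str) -> str | None:
    try:
        user_id, ts, sig = token.rsplit(":", 2)
    except ValueError:
        return None                  # malformed token
    payload = f"{user_id}:{ts}"
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None                  # tampered: signature mismatch
    if time.time() - int(ts) > MAX_AGE_SECONDS:
        return None                  # replayed: timestamp too old
    return user_id

token = issue_token("alice")
assert verify_token(token) == "alice"
assert verify_token(token + "x") is None   # tampering is detected
```

A production system would typically use an established format such as JWT, with the same two checks (signature and freshness) performed by a vetted library.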
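For the TLS/SSL protections in item 6, the client side must actually verify certificates rather than disable the checks. A minimal sketch using Python's standard library; the URL is a placeholder.

```python
# Illustrative sketch for item 6: a client that insists on certificate
# verification and a minimum TLS version.
import ssl
import urllib.request

ctx = ssl.create_default_context()        # verifies the server certificate
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

with urllib.request.urlopen("https://api.example.com/health",
                            context=ctx) as resp:
    print(resp.status)

# Never pass an unverified context just to silence certificate errors;
# that reintroduces the man-in-the-middle risk TLS exists to prevent.
```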
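And for the poor-code examples in item 9, restricting callers to known source networks is straightforward. A minimal allowlist sketch; the network ranges are hypothetical.

```python
# Illustrative sketch for item 9: accept calls only from known source networks.
import ipaddress

ALLOWED_NETWORKS = [
    ipaddress.ip_network("10.0.0.0/8"),        # internal callers
    ipaddress.ip_network("203.0.113.0/24"),    # a known partner range
]

def ip_allowed(remote_addr: str) -> bool:
    addr = ipaddress.ip_address(remote_addr)
    return any(addr in net for net in ALLOWED_NETWORKS)

assert ip_allowed("10.1.2.3")
assert not ip_allowed("198.51.100.7")   # unknown source: reject the call
```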

Finally, whenever an API is being designed or evaluated for usage, the base security parameters have to be sufficiently considered and evaluated. The evaluation or design should always leave enough room to extend security features as and when required, along with facilities for remedial measures and alerts that notify users whenever there is a security threat or breach of the API.


Cognitive Blockchain?? An Agriculture sector perspective!

By Sairam Bollapragada and Rajesh Mohandas

In today's hyper-connected world, driven by a hyper-dependent technology landscape, black swan effects are increasing as a result of Volatility, Uncertainty, Complexity and Ambiguity in the globalized scenario. There is recognition that the demand (crisis?) in food security is only going up. Trust is becoming a mandate rather than an option, especially for a sector focused on "Farm to Fork" while already battling drought, emerging crop diseases, pest resistance to chemicals and genetic traits, depleting phosphorus mines, salty soils, fertilizer dependence and the growing influence of anti-science forces.

A bigger threat faced by this $5 trillion sector (15% of global consumer spend, 40% of employment and 30% of greenhouse emissions) is "agro-terrorism". In the United States alone, crop and forest production losses from invasive insects and pathogens have been estimated at almost US$40 billion per year.

Hence the need for a "TRUST PROTOCOL": a ledger of everything, assisted by cognitive systems, to establish transparency and traceability back to the source and address the ever-increasing concern of providing quality, safe nutrition for a growing population. To meet that need, food production must increase by 70% from current levels (at near-zero wastage), and one potential solution could be an intelligence-driven Blockchain (a Cognitive Blockchain?).
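To make the "ledger of everything" idea concrete, here is a minimal sketch of a hash-chained ledger in Python. It illustrates only the tamper-evidence property; a real blockchain adds distributed consensus on top, and the lot identifiers below are made up.

```python
# Minimal hash-chained ledger: each record carries the hash of its
# predecessor, so any later tampering breaks the chain.
import hashlib, json, time

def block_hash(block: dict) -> str:
    body = {k: v for k, v in block.items() if k != "hash"}
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def make_block(data: dict, prev_hash: str) -> dict:
    block = {"data": data, "prev_hash": prev_hash, "ts": time.time()}
    block["hash"] = block_hash(block)
    return block

chain = [make_block({"event": "genesis"}, prev_hash="0" * 64)]
chain.append(make_block({"lot": "MANGO-42", "event": "harvested"},
                        chain[-1]["hash"]))
chain.append(make_block({"lot": "MANGO-42", "event": "shipped"},
                        chain[-1]["hash"]))

def chain_valid(chain: list) -> bool:
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False   # record contents were altered after the fact
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False   # the link to the predecessor is broken
    return True

assert chain_valid(chain)
chain[1]["data"]["event"] = "relabelled"   # tamper with history...
assert not chain_valid(chain)              # ...and the ledger exposes it
```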

This kind of Blockchain can provide techniques for upfront detection of malicious agents in an autonomous, AI-dominated supply chain. It brings consensus and persistence algorithms, rendering multi-agent cognitive connected solutions into an evolving, self-organized structure capable of overcoming data oligopoly. Let us look at a few sample applications:

  1. Transparency related issue resolution:
    1. Fund allocations by government or private agencies to farmers in developing nations often do not reach the intended farmers in time, for various reasons, and lead to farmers' bad debts. Blockchain-based digital tokens used as currency can come to the rescue here: token-enabled farmers can transact for fertilizers, seeds and other necessities with minimal probability of the funds being misused or imitated, ensuring the right usage of allocated funds where they are needed most. Furthermore, AI algorithms can add predictive features that enable agencies to take proactive measures and plan future budgetary allocations.
    2. The same approach can also stall the misuse of land documents by middlemen or land sharks standing in for farmers. Blockchain can help create immutable land titles that prove ownership and insulate farmers from widespread corruption through the trust and transparency Blockchain provides. Cognitive techniques can further aid in fraud detection, and in fighting money laundering, counterfeit currency, and the like.
  2. Brokerage Avoidance:

High insurance premiums paid by farmers are a widely known issue, and much of the margin goes to middlemen. Cognitive Blockchain solutions can help eliminate the middlemen and add the required intelligence: CPQ (configure-price-quote) algorithms help farmers configure insurance plans and opt for the lowest premium for their crop produce, avoiding brokerage and processing fees and ensuring their premiums are actually paid on time.

In addition, such solutions will help insurance companies better understand a farmer's risk profile through predictive modeling and estimated profitability, improving farmers' credit history and making them more creditworthy.

  3. Supply Chain to Demand Chain:

With AI, ML and analytics, the behavior of both buyer and seller is changing. Blockchain makes this even more interesting with cognitive demand forecasting. As the appetite for data grows, the number of input sources and players participating in demand forecasting will increase. With intelligent Blockchain solutions one can not only keep track of transactions and contractual obligations with suppliers and other stakeholders, but also gain visibility into demand-supply scenarios. Farmers can run queuing algorithms and proactively route supply to meet the right demand at the right time, shifting the supply chain toward a demand chain.

  4. The Food Security factor:

Forbes reports a study at Walmart in which tracing the source of a carton of mangoes took 6 days and 18+ hours by conventional means, but less than 3 seconds with Blockchain. Described by The Economist as "the trust machine", blockchains can provide supply chain transparency and data integrity, giving a visible assurance of authenticity and helping fight food fraud, especially the fraudulent labeling of organic food that is becoming prevalent today.

If leveraged effectively, Cognitive Blockchain can play a critical role in meeting the safe, quality nutrition needs of a population of 9.6 billion by 2050!!

VUCA in the Digital world!!


By Sairam Bollapragada & Rajesh Mohandas

Across the globe, everyone is now connected in unprecedented ways. This is both a boon and a bane: we live in an era that is transforming and setting the stage for the next revolution. In the times when we were disconnected and every country operated in a silo, the challenges were limited to internal affairs and near-border conflicts only.

With technological advances we look at a bright and secure future on one hand; on the other, unrest continues and grows bigger by the day: conflicts, civil unrest, terrorism, ransomware, cyber crime and more are now integrated into our daily life.

The digital reality is shaking up some of our beliefs and compelling us to move to a more knowledge-driven IT economy: automation and AI, once limited to books, have finally come into the open, challenging us with how they can transform every space of life. Soon every white space is expected to be filled with cognitive behavior and techniques. Automation is forcing a re-wiring of skills for much of the IT workforce, spelling the end of careers if it is not done.

Hence one can relate to the four key parameters of VUCA: Volatility, Uncertainty, Complexity and Ambiguity. Each of these factors challenges the established order of the day, hence the need to cope with them in these turbulent times.

The compounded problem statement, with external influencing factors from market pressures, competition and shareholder and stakeholder expectations, is a strong indicator that leaders will need to be hard-wired for resilience.

The role of leaders managing the workforce will be crucial and critical in shaping the digital future of any organization. Most of the requirements to support a digital environment are not about the technology per se; they are about creating the environment to re-skill, building the flexibility to be agile, adapting to changing demands, and grooming the right talent for a safe digital future.

Let's take each of the parameters one at a time to see what it means in the Digital world:

(V) Volatility: The nature and dynamics of the change blowing across the landscape mandate catalysts to adapt to it. A legacy of efficiency and productivity will no longer sustain a business. Disruptive innovations are indeed unsettling dominant industries in today's world. Hence the times call for compulsive innovation and a drift away from SOPs.

(U) Uncertainty: This factor reflects the lack of predictability and the abundance of surprises. Another indicator is the refusal of the current technology wave to move easily beyond the labs. The ever-experimenting mindset also means that solutions are prone to obsolescence from the very moment they are conceived, with a high degree of unpredictability.

(C) Complexity: Multiple parameters built into the character of an issue spell complexity, whether chaos-led or confusion-led.

Complexity can also reflect multiple influencing factors that can unsettle things easily. Complexity is good or bad depending on your strategy. A bullet-proof strategy is impossible; nevertheless, one should have a solid strategy to counter complexities and challenge them, even if it comes with a short expiry date (say, two years).

The Digital space grows more complex with each passing day: a new platform rolls out, new innovations come to light, new solutions are offered, disruptive models come to life. To deal with all these changes, a strategy for managing change is mandatory.

(A) Ambiguity: The fact that we know only 40% of how technology will fold into lives and markets as an influencer is a true reflection of the haze in the Digital space. This raises the question of business risk, which is quite a reality today.

At various levels of an organization there are ambiguities about progression and growth, whether at the organizational level or in the careers of professionals. Beyond their lexical meanings, the lines between Strategic and Tactical approaches are thinning out.

Volatility, Uncertainty, Complexity and Ambiguity will continue to exist, but what leaders can do today is play a vital role and attempt to control the levers by moving into a hyperawareness zone of informed decision-making and fast execution. Winning in the Digital Vortex is not just about algorithms, architectures or innovative business models; it requires organizational change and workforce transformation. And successful transformation is enabled by a company's digital business agility, building on the fact that people are an organization's most important asset. Hence everybody is compelled to think about forward-thinking strategies to adapt to Digital VUCA scenarios.

The KM role in “Staying Relevant” in the Digital Age

By Sairam Bollapragada & Bhuvaneswari Valluri


A recent Google Trends chart shows a spurt of interest in digital transformation since May 2015, taking precedence over the mission-critical activity that had been the trend in the previous couple of years. India tops the chart with an interest score of 100, followed by Australia at 75 and the United Kingdom at 51. Given this focus, and considering that technology that once "belonged" to a few has become (and is becoming) open, organizations worldwide are finding it difficult to deal with the abundance of information and knowledge assets being churned out.

While much of the above is freely available on the internet, knowledge workers and subject matter experts within the organization are largely unable to capture their expertise and experience quickly enough to proliferate it across the organization; hence a natural transmission loss.

Businesses and customers alike are constantly demanding change and challenging the status quo.

The desire is to leverage emerging and disruptive technologies to provide a competitive edge, build uniqueness into products and services, and ensure faster growth. The technology space is evolving quicker than our imagination.

Organizational speed and agility remain key! Improved productivity, streamlined delivery and higher levels of customer satisfaction are a few of the demands. In the process, organizations are generating that much more knowledge to present the world with alternate solutions to a multitude of problems and needs. But organizations cannot afford to be data rich and insight poor. Availability of the right information and knowledge at the right time continues to be the need of the hour, and the organizational memory refresh needs to be that much quicker.

Knowledge acquisition and its conversion to explicit knowledge still remain a challenge. We need to get more structured about how we want to manage information. The new smart knowledge management system (SKMS) is conceived as a hybrid knowledge-based decision support system that takes information and sends it through four macro-processes: diagnosis (the base or integration layer), prognosis (the analysis layer), solution (the solutioning layer), and knowledge (which finds solutions to issues and presents alternatives based on past experience), in order to build the Decisional DNA of an organization. The SKMS implements a model for transforming information into knowledge by using sets of experience knowledge structures and leveraging Communities of Practice.
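As a rough illustration of those four macro-processes, here is a minimal sketch of such a pipeline. Every name and structure below is hypothetical; this is not the published SKMS design, only a toy rendering of the diagnosis, prognosis, solution and knowledge stages.

```python
# Hypothetical sketch of a four-stage SKMS-style pipeline.
from dataclasses import dataclass

@dataclass
class Experience:
    issue: str
    analysis: str = ""
    proposal: str = ""

def diagnose(raw_report: str) -> Experience:
    # Base/integration layer: normalize incoming information into a record.
    return Experience(issue=raw_report.strip().lower())

def prognose(exp: Experience) -> Experience:
    # Analysis layer: assess the likely impact of the diagnosed issue.
    exp.analysis = f"impact assessment for: {exp.issue}"
    return exp

def solve(exp: Experience, past: list) -> Experience:
    # Solutioning layer: propose a fix, preferring similar past experiences.
    similar = [p for p in past if p.issue == exp.issue]
    exp.proposal = similar[0].proposal if similar else "escalate to CoP"
    return exp

def retain(exp: Experience, past: list) -> None:
    # Knowledge layer: store the experience for future decision support
    # (the "Decisional DNA" the text refers to).
    past.append(exp)

decisional_dna: list = []
exp = diagnose("Login API latency spike")
exp = prognose(exp)
exp = solve(exp, decisional_dna)
retain(exp, decisional_dna)
```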

A heavy focus on centralized KM repositories is essential; they must be kept current with an inflow of the latest information while redundant and outdated information is weeded out regularly, on ever shorter lifecycles. KM processes for the capture, storage, sharing and archival of knowledge assets have to be that much more efficient, quick and effective. Organizations that have invested in KM practices are making headway by focusing on smarter knowledge management frameworks and adopting the tools and mechanisms trending this year: SEO, improved usability, content tagging for relevant and faster search results, mobile interfaces for knowledge on the move, and so on.

The employee learning and unlearning curve becomes that much shorter and challenges managers to keep pace, stay relevant, and make decisions based on critical factors that include the availability of training by experts (external and internal) and individual employees' attention and memory spans.

Microsoft's Satya Nadella says, "We are moving from a world where computing power was scarce to a place where it now is almost limitless, and where the true scarce commodity is increasingly human attention." Interestingly, Microsoft recently conducted a study on what impact technology and today's digital lives are having on attention spans. It is not very heartening to see that while the average human attention span was around 12 seconds in 2000, it had dwindled to 8 seconds by 2013, apparently less than a goldfish's attention span! Alarming, in a way, considering that customer expectations are volatile and employees need to deliver services and products well ahead of time while keeping in mind competitive pricing and high quality.

Businesses worldwide are figuring out ways of ensuring a higher frequency of knowledge asset updates. Current research from HBR suggests that machine learning and computational linguistics are making a difference to organizations worldwide. Interesting examples have been shared of organizations using natural language processing to perform and learn time-intensive data entry and documentation tasks, computer vision to scan and analyse images, predictive maintenance, and so on. This is good for organizations that have made conscious investment choices to stay current. But is the writing on the wall clear enough for those still dealing with such issues?

Simpler ways to address this need have to be adopted. Exchange of tacit knowledge through communities, discussion boards, wikis and micro-blogging must increase. Digital transformation project and delivery stories need to be shared by making this a KRA of each project manager. Cross-pollination of expert knowledge via webinars, podcasts and other modes needs to be mandated. Usability is the essence here, and information architecture is prime. Organizations must invest in periodically revamping their taxonomies and metadata structures to ensure employees are equipped with the right information at the right time, making them that much more capable. Incentivization in non-monetary forms must be encouraged, as this addresses the WIFM (what's in it for me?) question for employees. Periodic promotion of existing knowledge to increase KA usage should also be considered.

However, all this is not possible without proper governance. The following can help:

  • Knowledge assets' (KA) review mechanisms must be established through domain expert teams.
  • Customer confidentiality and non-disclosure agreements must be made more stringent.
  • Knowledge assets’ usage reports have to be automated.
  • Managers and decision makers must be able to access these reports and dashboards as required.
  • KA retention period and archival mechanisms must be established through a structured KM Strategy Plan.
  • Measures to ensure knowledge is constantly being made shareable should be mandated.
  • Demarcation of what is mandatory and what is bolt-on for teams should be established (how about a team knowledge strategy?).

In essence, what is required is a coherent and concerted effort by organizations to ensure they have the wherewithal, in terms of the right set of knowledge assets enabled by effective KM processes, to let their employees maintain high knowledge levels while being consistently challenged to improve and sharpen their learning curves, and hopefully to beat the goldfish's attention span!