Today, we are spoilt for choice in almost everything, so it comes as no surprise that advanced technologies too offer a wide array: artificial intelligence/machine learning, analytics, 3D printing, augmented reality/virtual reality, cloud and IoT (internet of things). In truth, these are still in nascent stages (when compared to their real potential) and, in India particularly, firms remain undecided about where to place their bets. However, at this point, AI is ahead of the curve and has avowedly staked its claim as the "new electricity".
Now imagine a person who is afflicted by a life-threatening disease and requires immediate medication. Right now in 2018, or even 2-3 years hence, what would be the probability of an Indian patient choosing to have their medicines 3D printed, vis-à-vis buying them from an authorized pharmacist? Frankly, minimal. This is not to imply that such a possibility will never become a reality; the example is selective, and in many other instances additive manufacturing may well be the preferred choice even today. In the New Health Economy, an environment shaped by technology, patients are seeking a greater role in their own care, pressure to reduce costs is mounting, and physicians' autonomy is clearly on the decline. Besides the commonly cited example of telemedicine, AI has been successfully applied in niche areas where genomic data provide insights on an individual's health, traits, lifestyle and drug responses. The impact on patient treatment, and on the patient experience, has been remarkable.

Incredible successes abound in IoT applications in agriculture as well. Precision farming techniques, for instance, use field sensors to monitor farming operations, and the impact can be felt right across the value chain: production, processing, storage, distribution and consumption. AI solutions can provide site-specific and timely data on crops to enable optimum use of fertilizers. We could continue to draw upon examples from other sectors and build a strong case for advanced technologies, as sci-fi enters our drawing rooms.
Naturally, all this is data-driven. So what stops cyber-criminals from using advanced technologies to mine this mountain of data and further intensify the cyber-attacks that are already creating havoc? Frankly, nothing; to get real, it is being done already. In the past too, the darker elements of society have never shied away from putting technology to unintended use, with very harmful consequences. The new generations of malware attacks can be difficult to detect with conventional tools. That is why dynamic approaches rely on machine learning to mine data, pre-empt attacks and respond effectively. As in most other fields, AI still cannot function on its own; at best it can augment cyber-security professionals and free up their time. AI systems that handle threats directly do so according to a standardized procedure, so the element of human error can be removed. But if the attack is highly sophisticated, an AI-driven response may not be altogether accurate, at least for now.
We would do well to remind ourselves that hackers are well networked and share information in their circles. This raises an uncomfortable question: in our world, are we doing enough to share best practices and to foster a collaborative mindset? Equally, the bad actors are incredibly fast at learning new technologies. So where exactly are we on re-skilling?
Such comparisons may be unpalatable, but they are unavoidable, for two races are being run simultaneously. One is being run at a frenetic pace, aimed at human progress. The other, no less intense, is about derailing it. Tools and techniques are available to both sides, and it is the state of preparedness, or the lack of it, that will determine who stays ahead. And, to reiterate, there is that force multiplier called collaboration.