alt =""

Monday, 7 December 2015

Disruptive Technology Weekly Roundup – Dec 1st to Dec 7th

Prevention, detection and response in cyber security will see a sea change in 2016, says a new report from Forrester Research. According to Forrester, the five cybersecurity predictions, and the resulting actions to be taken in 2016, are as follows. First, in this era of disruptive technologies, where wearables and IoT are expected to become more prevalent, security and risk professionals should re-examine their existing security functions from a new angle, and consider the human factor when addressing security threats. The second prediction concerns governments' security capabilities: the research firm gives a bleak assessment of the security capability of the US government, which it describes as short-staffed, under-budgeted and lacking internal discipline. The third prediction is an expected increase in security and risk spending of 5 to 10% in 2016. Fourth comes defense contractors' prospective entry into private industry with claims of 'military grade' security; Forrester warns private players to thoroughly vet their commercial experience and commitment before engaging them. The fifth prediction covers HR departments: in this era of increasing fraud, identity theft, medical identity theft and damage to personal online reputations, they will bring in identity and credit protection and resolution services as an employee benefit.

With the holiday season coming up, cyber security researchers in the US warn about ModPOS, a malware that is largely undetectable by current antivirus scans and has already infected some national retailers. According to the researchers, it is one of the most sophisticated point-of-sale malware families, with a complex framework capable of collecting detailed information about a company, including payment information and the personal log-in credentials of executives. To address the threat, companies need to use more advanced forms of encryption to protect consumer data. Point-to-point encryption, where a consumer's payment card data is unlocked only after it reaches the payment processor, is one effective method to combat the threat. Security experts warn that without such protections, even new credit cards with the chip technology known as EMV could still be compromised by infected point-of-sale systems.

The information security landscape is continuously evolving, and the proliferation of disruptive technologies like mobile, social, cloud and big data is increasingly impacting protection strategies. In-depth strategies to monitor, analyse and report security incidents are paramount to delivering an effective enterprise security risk management profile. Happiest Minds, with its deep expertise in the security arena and a large pool of experienced security professionals, brings in security solutions that address the key challenges faced by enterprises today. Our services aim to improve the agility, flexibility and cost effectiveness of next-generation information security and compliance programs.

How Do You Solve a Problem Like Cyber Security?

Happiest Minds UK discusses the new-age deception technologies UK businesses should adopt to bolster their cyber-security defences.
The recent TalkTalk cyber-security breach has brought the issue of security firmly back into the public's psyche and has put both government and organisations on high alert. It seems that regardless of your vertical market, be it finance, technology or banking, the threat of a cyber breach is ever-present. Only today I read an article reporting that Britain's Trident nuclear weapons system may be vulnerable to cyber-attack by a hostile state, according to former defence secretary Des Browne.
So, despite the UK being one of the highest EU spenders on IT security, existing cyber security solutions are simply not good enough to stop malicious hackers and evolving threats. It's little wonder that Chancellor George Osborne has pledged to spend an additional £1.9 billion on cyber security and has committed to the creation of a 'National Cyber Centre' to respond to major attacks on Britain.
So, how do you solve a problem like cyber security? Well, the answer could well be to implement emerging deception technologies such as next-generation honeypots and decoy systems which, according to a new Gartner report entitled ‘Emerging Technology Analysis: Deception Techniques and Technologies Create Security Technology Business Opportunities’, could have a game changing impact on enterprise security strategies.
Deception technologies are effectively tools which deceive attackers and enable the identification and capture of malware at the point of entry. They misdirect intruders and disrupt their activities at multiple points along the attack chain by luring them towards fake or non-existent data and away from the organisation's critical data.
Let us look at a few of these technologies in greater detail:
Honeypots—or software emulations of an application or server—have been around for a few years now. A honeypot works by offering 'honey', something that appears attractive to an attacker, who will then expend his resources and time on gathering the honey. Meanwhile, the honeypot does an admirable job of drawing his attention away from the actual data it seeks to protect.
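As a concrete illustration, a low-interaction honeypot can be as simple as a listener that advertises a fake service banner and records whatever a probe sends. The banner, port choice and log format below are illustrative, not taken from any particular product:

```python
# A minimal low-interaction honeypot sketch: accept connections,
# present a fake service banner, and log the peer and its first bytes.
import socket
import threading

def run_honeypot(host="127.0.0.1", port=0, max_conns=1, log=None):
    """Serve up to max_conns connections, recording attacker activity.

    Returns (bound_port, log, thread) so a caller can inspect captures.
    """
    if log is None:
        log = []
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((host, port))
    srv.listen(5)
    bound_port = srv.getsockname()[1]

    def serve():
        for _ in range(max_conns):
            conn, addr = srv.accept()
            conn.sendall(b"SSH-2.0-OpenSSH_6.6\r\n")  # fake banner (the 'honey')
            data = conn.recv(1024)                     # capture the probe input
            log.append({"peer": addr[0], "data": data})
            conn.close()
        srv.close()

    t = threading.Thread(target=serve, daemon=True)
    t.start()
    return bound_port, log, t
```

In practice the captured input (attempted credentials, exploit payloads) is what feeds later analysis, while the attacker's time is spent far from the real systems.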
Decoys are similar to honeypots and cause the attacker to pursue the wrong (fake) information. Many decoys act together to fill the attacker’s radar in a manner as to render it difficult for him to differentiate between real and fake targets.
However, organisations are now looking for more active defence strategies that not only lure in attackers, but also trap them, confound them and track their activity. One such deception technology offers an emulation engine masquerading as a run-of-the-mill operating system. The ‘operating system’ contains ‘sensitive’ data that could be attractive to attackers, for example data labelled ‘credit card info’. The platform will lure the attacker in by allowing him to ‘hack’ this fake data and in turn start gathering information about his movements and the codes that he seeks to modify. This intelligence can then be shared with other security tools, such as intrusion prevention systems, to defend against the attack.
A number of start-ups are designing various kinds of intrusion deception software that insert fake server files and URLs into applications. These traps are visible only to hackers and not normal users. An example of such a snare could be trapping hackers probing for random files by granting them access to bogus files that are a dead-end and merely keep leading them in circles towards more fake data. Or protecting the system against brute-force authentication by scrambling the attacker’s input so he can never get the password right, even if he does happen to type out the right code.
Other technologies set up fake IP addresses on webservers that, on multiple attempts to hack them, will always present a deception to that user. Other companies set up virtual systems or computers that actually have no data on them, and are indistinguishable from other machines on the network. Repeated intrusion into and unwarranted activity on these systems make it easy to identify hackers. The hackers’ movements and methods can then be analysed, and the data fed back into other threat detection solutions and tools.
Deception technologies therefore create baits or decoys that attract and deceive attackers, making it quicker for an organisation to detect a security breach. They increase the attacker's workload and exhaust his resources. Certain solutions go beyond merely setting up decoys to also conduct forensic analysis on these attacks, so the organisation can effectively defend its network and speedily mitigate security breaches. It may not be a 'one size fits all' answer to the cyber security conundrum, but it is certainly one more weapon in the organisation's armoury against hackers.

Wednesday, 25 November 2015

Taming the Elephant....Building a Big Data & Analytics Practice - Part I

A couple of decades ago, the data and information management landscape was significantly different. Though the core concepts of analytics have not, in a large sense, changed dramatically, adoption and the ease of analytical model development have shifted significantly in recent years. Traditional analytics adoption has grown exponentially, and Big Data Analytics demands additional, newer skills.

For further elaboration, we need to go back in time and look at the journey of data. Before 1950, most data and information was stored in file-based systems (after the earlier invention and use of punched cards). Around 1960, Database Management Systems (DBMS) became a reality with the introduction of hierarchical database systems like IBM Information Management System (IMS), and thereafter network database systems like Raima Database Manager (RDM). Then came Dr. Codd's normal forms and the relational model. Small-scale relational databases (mostly single-user initially) like dBase, Access and FoxPro started gaining popularity.

With System R from IBM (from which the widely used Structured Query Language and, later, IBM DB2 were created), and the ACID (Atomicity, Consistency, Isolation, Durability) compliant Ingres database being released, commercialization of multi-user RDBMS became a reality, with Oracle and Sybase (now acquired by SAP) databases coming into use in the following years. Microsoft had licensed Sybase on OS/2 as SQL Server and later split with Sybase to continue on the Windows OS platform. The open source movement continued with PostgreSQL (an object-relational DBMS) and MySQL (now acquired by Oracle) being released around the mid-1990s. For over two decades, RDBMS and SQL grew to become the standard for enterprises to store and manage their data.

From the 1980s, data warehousing systems started to evolve, storing historical information to separate the overhead of reporting and MIS from OLTP systems. With Bill Inmon's CIF model and later Ralph Kimball's popular OLAP-supporting dimensional model (denormalized star and snowflake schemas) gaining popularity, metadata-driven ETL and Business Intelligence tools started gaining traction, while database product strategies promoted the then lesser-used ELT approach and other in-database capabilities, like the in-database data mining released in Oracle 10g. For DWBI products and solutions, storing and managing metadata in the most efficient manner proved to be the differentiator. Data modeling tools started to gain importance beyond desktop and web application development. Business rules management technologies like ILOG JRules, FICO Blaze Advisor and Pega started to integrate with DWBI applications.
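Kimball's dimensional model can be illustrated with a toy star schema: one central fact table joined to denormalized dimension tables, queried with a typical rollup. The table and column names below are invented for the example (shown in SQLite for brevity):

```python
# A tiny star schema: fact_sales at the center, dim_date and dim_product
# around it. All names and data are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE fact_sales  (date_key INTEGER, product_key INTEGER, amount REAL);
""")
cur.executemany("INSERT INTO dim_date VALUES (?,?,?)",
                [(1, 2015, 11), (2, 2015, 12)])
cur.executemany("INSERT INTO dim_product VALUES (?,?,?)",
                [(10, "Widget", "Hardware"), (11, "Gadget", "Hardware")])
cur.executemany("INSERT INTO fact_sales VALUES (?,?,?)",
                [(1, 10, 100.0), (2, 10, 250.0), (2, 11, 75.0)])

# A typical OLAP-style rollup across the star join: revenue by month.
rows = cur.execute("""
    SELECT d.year, d.month, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY d.year, d.month
    ORDER BY d.year, d.month
""").fetchall()
```

The denormalized dimensions keep the join shallow, which is what makes this shape friendly to ad-hoc slicing and dicing.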


Once data warehouses started maturing, the need for data quality initiatives began to rise. Most data warehousing development cycles used only a subset of production data (at times obfuscated or masked) during development, so even when the implementation approach included data cleansing and standardization, core DQ issues would emerge after the production release, at times rendering the warehouse unusable until they were resolved.
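The kind of post-production DQ issue described above is typically caught by rule-based profiling run over the full production data rather than a development sample. A minimal sketch, with field names and rules invented for illustration:

```python
# Rule-based data-quality profiling: count how many records fail each
# check (completeness, format validity, standardization). The rules and
# field names are hypothetical examples.
import re

RULES = {
    "customer_id": lambda v: bool(v),                    # completeness
    "email":       lambda v: re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+$", v or "") is not None,
    "country":     lambda v: v in {"US", "UK", "IN"},    # standardization
}

def profile(records):
    """Return a per-rule count of failing records."""
    failures = {rule: 0 for rule in RULES}
    for rec in records:
        for field, check in RULES.items():
            if not check(rec.get(field)):
                failures[field] += 1
    return failures
```

Running such checks continuously, instead of only at development time, is what surfaces the issues a masked sample would have hidden.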

Multi-domain Master Data Management (both operational and analytical) and Data Governance projects started to grow in demand once organizations began to view data as an enterprise asset, enabling a single version of the truth to increase business efficiency and support both internal and, at times, external data monetization. OLAP integrated with BI to provide ad-hoc reporting, besides being popular for what-if modeling and analysis in EPM/CPM implementations (Cognos TM1, Hyperion Essbase, etc.).

Analytics was primarily implemented by practitioners using SAS (1976) and SPSS (1968) for descriptive and predictive analytics in production environments, and ILOG CPLEX (1987) and ARENA (2000) for prescriptive modeling, including optimization and simulation. While SAS had programming components within Base SAS, SAS/STAT and SAS Graph, its strategy evolved towards a UI-based modeling platform with the launch of Enterprise Miner and Enterprise Guide, products similar to SPSS Statistics and Clementine (later IBM PASW Modeler), which were essentially UI-based drag-drop-configure analytics model development software for practitioners, usually with a background in mathematics, statistics, economics, operations research, marketing research or business management. Models used representative sample data and a reduced set of factors and attributes, and hence performance was not an issue until then.

Around the middle of the last decade, anyone with knowledge and experience of Oracle, ERwin, Informatica and MicroStrategy, or competing technologies, could play the role of a DWBI technology lead, or even an information architect with additional exposure to and experience in designing non-functional DW requirements, including scalability, best practices, security, etc.

Soon, enterprise data warehouses, now needing to store years of data, often without an archival strategy, started to grow exponentially in size. Even with optimized databases and queries, performance dropped. Then came appliances, or balanced/optimized data warehouses: optimized database software, often coupled with the operating system and custom hardware. Most appliances supported only vertical scaling, but the benefits they brought were rapid accessibility, rapid deployment, high availability, fault tolerance and security.
Appliances thus became the next big thing, with agile data warehouse migration projects being undertaken to move from RDBMS like Oracle, DB2 and SQL Server to query-optimized DW appliances like Teradata, Netezza, Greenplum, etc., incorporating capabilities like data compression and massively parallel processing (shared-nothing architecture), apart from other features. HP Vertica, which took the appliance route initially, later reverted to being a software-only solution.

Initially, parallel processing had three basic architectures: MPP, SMP and NUMA. MPP stands for Massively Parallel Processing and is the most commonly implemented architecture for query-intensive systems. SMP stands for Symmetric Multiprocessing and had a shared-everything (including shared disk) architecture, while NUMA stands for Non-Uniform Memory Access, essentially a combination of SMP and MPP. Over time, the architecture definitions became more amorphous as products kept improving their offerings.
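The shared-nothing idea behind MPP can be sketched in a few lines: each node aggregates only the partition of data it owns, and the partial results are combined in a final gather step. This is purely a conceptual illustration (threads stand in for nodes; the data layout is invented), not how any appliance is actually implemented:

```python
# Shared-nothing aggregation sketch: one scan per partition, then a
# combine step over the partial results.
from concurrent.futures import ThreadPoolExecutor

def scan_partition(partition):
    """Local aggregation over one node's own data; nothing is shared."""
    return sum(row["amount"] for row in partition)

def mpp_style_sum(partitions):
    """Fan out one scan per partition, then combine the partials."""
    with ThreadPoolExecutor(max_workers=len(partitions)) as pool:
        partials = list(pool.map(scan_partition, partitions))
    return sum(partials)
```

Because each scan touches only local data, adding partitions (nodes) scales the work out horizontally, which is exactly what shared-everything SMP designs struggle to do past a point.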

While industry and cross-industry packaged DWBI and analytics solutions increasingly became a product and SI/solution partner strategy, the end of the last decade saw growing adoption of open source ETL, BI and analytics technologies like Talend, Pentaho, the R library, etc. across industries (with the exceptions of the Pharma & Life Sciences and BFSI sectors), and in organizations where essential features and functionality were sufficient to justify the ROI on DWBI initiatives, which were usually undertaken for strategic requirements rather than for day-to-day operational intelligence or insight-driven new revenue generation.

Adoption of cloud-based platforms and solutions, and even DWBI and analytics application development on private or public cloud platforms like Amazon and Azure (IBM has since come out with Bluemix and dashDB as alternatives), also started to grow, either as part of a start-up strategy or a cost optimization initiative of small and medium businesses, and even in some large enterprises as an exploratory initiative, given confidence in data security.

Visualization software also started to emerge and carve a niche, growing in relevance mostly as a complementary solution to IT-dependent enterprise reporting platforms. Visualization products were business-driven, unlike technology-forward enterprise BI platforms that could also provide self-service, mobile dashboards, write-back, collaboration, etc., but had multiple components with complex integration and, at times, complex pricing.

Hence, while traditional enterprise BI platforms had a data-driven, "bottom-up" product strategy, with dependence on and control with the IT team, visualization software took a business-driven, "top-down" product strategy, empowering business users to analyze data on their own and create their own dashboards with minimal or no support from the IT department.

With capabilities like geospatial visualization, in-memory analytics and data blending, visualization software like Tableau is increasingly gaining acceptance. Others have blended visualization with out-of-the-box analytics, like TIBCO Spotfire and, in recent years, SAS Visual Analytics, a capability otherwise achieved in visualization tools mostly by integrating with R.

All of the above was manageable, with reasonable flexibility and continuity, as long as data was more or less structured, ECM tools took care of documents, and EAI technologies were used mostly for real-time integration and complex event processing between applications and transactional systems.

But a few years ago, digital platforms, including social, mobile and others like IoT/M2M, started to grow in relevance, and Big Data Analytics grew beyond POCs undertaken as experiments, first complementing the enterprise data warehouse (along with enterprise search capabilities) and at times even replacing it. The data explosion gave rise to the three-V dilemma of velocity, volume and variety, and data was now available in all possible forms and in newer formats like JSON and BSON, which had to be stored and transformed in real time.

Analytics now had to be done over millions of data points in motion, unlike traditional end-of-day analytics over data at rest. Business intelligence, including reporting and monitoring, alerts and even visualization, had to become real-time. Even the consumption of analytics models now needed to be real-time, as in the case of customer recommendations and personalization, which try to leverage the smallest windows of opportunity to up-sell and cross-sell to customers.
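The shift from end-of-day batch to data in motion can be illustrated with a sliding-window aggregate that updates on every arriving event, which is the basic building block behind real-time alerts and recommendations. A minimal sketch (the window size and values are arbitrary):

```python
# Per-event sliding-window aggregation: each update ingests one event
# and returns the current windowed average, so downstream logic can
# react inside a small window of opportunity.
from collections import deque

class SlidingWindowAverage:
    def __init__(self, window_size):
        self.window = deque(maxlen=window_size)  # old events fall off automatically

    def update(self, value):
        """Ingest one event and return the average over the current window."""
        self.window.append(value)
        return sum(self.window) / len(self.window)
```

The same update-on-arrival pattern, scaled out, is what stream processing engines provide for data in motion.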

Artificial Intelligence systems powered by Big Data are becoming the game changer of the near future, and companies like Google, IBM and Honda are leading the way in this direction.
To be continued...

5 Ways to Secure the Public Cloud

As cloud computing becomes more sophisticated and mainstream, the shift to the public cloud is gaining tremendous traction. With big-brand clouds (Amazon Web Services, Google Cloud Platform and Microsoft Azure) fast evolving, more and more enterprises are moving away from private clouds. However, security is justifiably a top concern when moving applications and data into the public cloud. Some of the questions foremost on everyone's mind are: How secure is my data? What will happen if there is a breach at the public cloud vendor? How do I ensure that my data is properly protected in that case?

Security is ultimately a shared responsibility between the company and the public cloud vendor. According to Forrester, cloud success comes from mastering the "uneven handshake". While cloud vendors are typically responsible for securing the data center, infrastructure and hypervisor, the onus is on you, as the consumer, to secure everything above that line (the OS, users, applications and data) in tandem with the vendor.

Journeying to the Public Cloud

The key is to find a cloud provider that best fits your business. This means you need to thoroughly vet potential vendors and conduct a full risk assessment before signing any contract. Given that different cloud service providers offer varying levels of security, it is best to look at their security and compliance activities and choose one with transparent processes. Once this decision has been made, the next step is to take into account the various security risks and chart possible solutions to create a secure cloud environment.

Here are 5 steps to best protect data in the public cloud:

Intelligent Encryption

Encryption is a vital security component of any organization, and it is all the more important when transferring and storing sensitive data in the cloud. It ensures data confidentiality, mitigating the risk of data loss or theft in the case of a breach in the cloud. This focus on the data itself, rather than placing full emphasis on the infrastructure for protection, goes a long way towards ensuring that data stays safe even if network or perimeter security is compromised.
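To make the idea concrete: client-side encryption keeps the keys with the data owner rather than the provider. Production systems should use a vetted authenticated cipher (for example, AES-GCM via a maintained library); the standard-library sketch below shows only the key-derivation and tamper-detection pieces, with parameters chosen purely for illustration:

```python
# Key derivation plus integrity sealing for a blob stored in the cloud.
# This is a teaching sketch, not a complete encryption scheme.
import hashlib
import hmac

def derive_key(passphrase: bytes, salt: bytes) -> bytes:
    # PBKDF2 stretches a passphrase into a fixed-length key; the
    # iteration count here is an illustrative work factor.
    return hashlib.pbkdf2_hmac("sha256", passphrase, salt, 200_000)

def seal(key: bytes, blob: bytes) -> bytes:
    """Append an HMAC tag so tampering in the cloud is detectable."""
    return blob + hmac.new(key, blob, hashlib.sha256).digest()

def verify(key: bytes, sealed: bytes) -> bytes:
    """Return the blob if the tag checks out; raise otherwise."""
    blob, tag = sealed[:-32], sealed[-32:]
    if not hmac.compare_digest(tag, hmac.new(key, blob, hashlib.sha256).digest()):
        raise ValueError("cloud object failed integrity check")
    return blob
```

The important property is that the provider only ever holds sealed blobs; the passphrase, and therefore the key, never leaves the consumer's side.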
Strict Identity Management and Access Control

An effective identity management strategy for the cloud can be summed up under the three 'A's: access, authentication and authorization. Consumers must ensure that only trusted and authorized users can access public cloud data, through a strong identity management system. Additional layers of authentication further help in ensuring a controlled cloud environment. An important note here is to find a good balance between security and developer productivity.
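The authorization leg of the three 'A's can be sketched as a role-based check: a user's roles (authentication assumed already done) are resolved to permissions, and the requested action is tested against them. The role, user and action names here are invented:

```python
# Minimal role-based access control: roles map to permission sets,
# users map to roles, and an access request is checked against the union.
ROLE_PERMISSIONS = {
    "analyst": {"read"},
    "admin":   {"read", "write", "delete"},
}

USER_ROLES = {
    "alice": {"admin"},
    "bob":   {"analyst"},
}

def is_authorized(user: str, action: str) -> bool:
    """Union the permissions of the user's roles, then check the action."""
    roles = USER_ROLES.get(user, set())
    allowed = set().union(*(ROLE_PERMISSIONS.get(r, set()) for r in roles))
    return action in allowed
```

Real deployments layer this under a directory or identity provider; the point of the sketch is that every access decision reduces to who the caller is and what their roles permit.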

Smart Security at All End-points

In most cases, physical security is covered by the cloud provider through regular audits and certifications from accreditation bodies. In certain industries like healthcare, finance and defense, security at all points along the data path is a regulatory mandate, whether data is entering or exiting the corporate network, moving to the cloud, or within the cloud itself. As a general rule in today's cloud and BYOD era, however, it is of utmost importance that the consumer ensures certain hardware necessities and best practices for end-point security in addition to the cloud security measures. Mobile devices in particular pose a unique challenge: despite best intentions, users generally do not prioritize securing them, which exposes potential access points to sensitive corporate data. Strong end-point measures should typically encompass mobile/on-device protection, next-generation firewalls, network intrusion systems, VPNs and up-to-date security architectures.

Real-time Monitoring & Incident Response

As part of the shift to a "prevent and control attack" mindset, real-time monitoring through analytics and forensics enables consumers to identify attacks early in the breach lifecycle. Instant alerts and automatic data collection through analytics enable rapid forensics and insights into behavior from the endpoint to the cloud. Armed with these insights, security teams can identify potential risks and patterns in real time, while also determining the path to immediate remediation. Organizations should also work with the cloud provider on enterprise-level visibility for applications hosted in the cloud, providing a multi-pronged approach for quick detection of and incident response to security issues.
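A minimal sketch of this kind of real-time alerting: events are evaluated as they arrive, and an alert fires once failed logins from one source cross a threshold. The threshold, event types and field names below are illustrative, not from any particular SIEM:

```python
# Per-event monitoring with a simple threshold rule: repeated failed
# logins from the same source raise an alert the moment the threshold
# is reached.
from collections import defaultdict

class LoginMonitor:
    def __init__(self, threshold=5):
        self.threshold = threshold
        self.failures = defaultdict(int)   # failed-login count per source IP
        self.alerts = []

    def ingest(self, event):
        """Process one event; record an alert when the threshold is crossed."""
        if event["type"] == "login_failed":
            self.failures[event["src_ip"]] += 1
            if self.failures[event["src_ip"]] == self.threshold:
                self.alerts.append(event["src_ip"])
```

Production monitoring adds time windows, correlation across event types, and automatic evidence collection, but the per-event evaluate-and-alert loop is the same.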

Strong Governance Framework

A governance framework is an essential tool that enables your IT security team to assess and manage all risk, security and compliance related to the organization's cloud environment. The crux of this framework is that it requires synergy between security, IT, the business and the organization itself to secure the cloud. A strong framework typically encompasses stringent security policies, audit compliance, identity management, security control tools, a BYOD policy and a contingency plan. But to ensure true compliance with cloud policies, organizations have to work closely with IT security teams to understand the unique challenges of cloud security and the ways to protect sensitive data workloads. Additionally, educating and training users to comply with the organization's cloud policies can go a long way towards achieving compliance.

Cloud computing is revolutionizing the way enterprises operate, with a slew of cost benefits and tremendous economies of scale. As with any other investment, it is your responsibility to ensure that the cloud is protected as much as possible. With a robust set of security processes and tools, a clear BYOD-compatible cloud computing strategy and a strong governance framework in place, risk is significantly reduced as you embark into the cloud. And the future is yours, as long as your organization continuously adapts to stay agile and competitive in a fast-evolving cloud technology landscape.

Cyber Threat Intelligence – What is needed?

Cyber Threat Intelligence (CTI) is a term used for any kind of information that helps protect your organization's IT assets from potential security breaches. CTI can take many forms: internet-based IP addresses, geolocations, or TTPs (Tools, Tactics and Procedures). These work as indicators or early warnings of attacks that can take a toll on an enterprise's IT infrastructure. There are numerous vendors across the globe whose CTI can be seamlessly made part of security interfaces like GRC tools, SIEM and other correlation engines. That being said, what information can be employed to generate actionable CTI to defend your enterprise? Let's look at this in detail:
Drivers may vary from attacks like a zero-day, to business-related breaking news, to announcements that create vulnerabilities in the enterprise's activities. Understanding the nature of these drivers can help increase security vigilance.

This accounts for everything an attacker would need to trigger an attack on your IT infrastructure: the intranet perimeter, the network, endpoints and just about anything exposed to the internet.
A script kiddie may be able to generate an attack but may not possess the capacity for post-attack activities; conversely, a professional attacker may have the capability to penetrate, while the defense mechanism fails to deny them the intended results. Understanding the capabilities of both the attacks and the attackers at length can help defend security to a great extent.
Another element to consider, to better equip security, is keeping an account of the attacker's tools, tactics and procedures used in past attacks. This helps generate indicators to better prepare for forthcoming attacks.
Measurement is important to determine the impact of an attack, mostly in terms of the number and types of security events generated in the pre-attack condition. The more ways we can interpret the nature and depth of these measurements, the more the security interface can work on counter-attack measures and recovery processes.
There are many security dimensions that, when considered carefully, can help avoid, tackle, monitor and recover from a security breach. While the aforementioned are a handful, the list can get much longer, including threat vectors, compromise parameters, defense techniques, business impact analytics, attack patterns from the past, zero-day detection, security control bypassing, post-compromise information, etc. The more of these factors we include, the better IT security vigilance gets.
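Turning a CTI feed into something actionable usually starts with indicator matching: comparing observed events against known-bad indicators. The sketch below matches source IPs against a hypothetical feed; hashes, domains and TTP patterns extend the same idea, and SIEM correlation engines do this at scale:

```python
# Indicator-of-compromise matching: flag events whose source IP appears
# in a threat feed. Feed contents and event fields are made up.
BAD_IPS = {"203.0.113.7", "198.51.100.23"}  # hypothetical feed entries

def match_indicators(events, indicators=BAD_IPS):
    """Return the events whose source IP appears in the threat feed."""
    return [e for e in events if e.get("src_ip") in indicators]
```

The matched events become the early warnings the text describes, feeding alerting and response workflows.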

Wednesday, 18 November 2015

Store as Fulfillment Center: Omnichannel and the Future of Retail

Omnichannel has come of age for brick-and-mortar retailers.
Traditional retailers have been on a slow yet steady adoption of digital technologies over the last two decades. First arrived e-commerce, which retailers took on as another channel for customer acquisition and sales. Alongside it emerged online-only players, opening up new avenues of fulfillment. Then came smartphones, setting a new paradigm for customer experiences.
Today, with the faster evolution of technology and ever-increasing consumerization, there is a demand for ultimate flexibility and innovation. Customers expect to be recognized and pampered, and they switch loyalty for the smallest of added perceived value – be it monetary based, convenience based, or experience based.
Brick-and-mortar retailers with an established national and/or international store network are specifically suited to meet the customers of today where they are – online, on mobile, in a physical store, or even in a subway station.  These phy-digital retailers can and must strive for true omnichannel – seamless, connected, and personalized experiences irrespective of how and where their customers shop.
The Potential of a Store
Despite the increasing adoption of digital shopping, the fact remains that, for bricks-and-clicks retailers, over 90 percent of revenues come from physical stores, and the store therefore continues to be the nerve center of operations. It is important for such retailers to realize the true potential of their huge store networks.
Stores can transform to be experience centers for omnichannel customers. Here are a few solutions that can bring transformational experiences in-store:
  • Experiential kiosks and digital displays
  • Digital signage
  • In-store IoT/beacon-based personalized experiences
  • Customer engagement driven by data insights
Stores can be mini-fulfillment hubs, offering ultimate flexibility when it comes to delivery choices and saving a potentially lost sale. Examples of such initiatives include the following:
  • Order online to pick up in store or at curbside, fulfilled from store or warehouse
  • Order in-store for home delivery, from a warehouse, the same store, or another store
  • Order in-store for pickup from the same store or another store
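Delivery choices like these ultimately come down to routing an order to the best node in the network that has stock. The sketch below is a hypothetical illustration of such routing logic, not any particular retailer's system; the location names, SKUs, and preference labels are invented for the example:

```python
# Hypothetical sketch of omnichannel fulfillment routing.
# Inventory per location; locations can be stores or warehouses.
INVENTORY = {
    "store_downtown": {"shirt_salmon_m": 0, "jeans_32": 4},
    "store_uptown":   {"shirt_salmon_m": 2, "jeans_32": 1},
    "warehouse_east": {"shirt_salmon_m": 25, "jeans_32": 40},
}

def route_order(sku, preference):
    """Pick a fulfillment node for an order.

    preference: 'pickup_in_store' favors stores,
                'home_delivery' favors warehouses.
    """
    stores = [loc for loc in INVENTORY if loc.startswith("store_")]
    warehouses = [loc for loc in INVENTORY if loc.startswith("warehouse_")]
    search_order = (stores + warehouses if preference == "pickup_in_store"
                    else warehouses + stores)
    for loc in search_order:
        if INVENTORY[loc].get(sku, 0) > 0:
            return loc
    return None  # nothing in the network: a potentially lost sale

print(route_order("shirt_salmon_m", "pickup_in_store"))  # store_uptown
print(route_order("shirt_salmon_m", "home_delivery"))    # warehouse_east
```

A production system would of course weigh shipping cost, distance, and store capacity rather than simply taking the first location with stock, but the principle – one network-wide inventory view driving the fulfillment decision – is the same.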
When armed with the right tools and technologies, store associates can be brand ambassadors, driving customer loyalty and improving customer retention. For example, when a store associate is asked about a salmon-pink shirt that was found online but is not in stock in the store, the associate should be incentivized, and have the tools, to check the inventories of nearby stores or the distribution center. Further, the associate should be empowered to take the order and have the product shipped to the customer's home at no extra charge the next day.
It’s a no-brainer that omnichannel retailers must invest in technologies that deliver the data to drive store-transformation initiatives.
Implications for Brick-and-Mortar Retailers
For a complete omnichannel transformation to be successful over the next two to three years, the foundation has to be strong. It starts with a data-driven, single view of the customer, orders, inventory, products, etc., and a scalable architecture to support dynamic changes in business.
  • To enable an endless aisle of products not limited to a store's physical space, a global product catalog should be available across channels, including your extended supply network and drop-ship vendors.
  • To enable stores to be fulfillment hubs, a real-time and reliable view of inventory data should be available across the entire supply network.
  • And for personalization to click, a 360-degree view of customers' online orders, store transactions, social engagement, lifetime value, and loyalty history – including open orders, queries, and complaints – is a must.
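In practice, that 360-degree view means joining records from siloed systems on a shared customer identifier. The toy sketch below illustrates the idea only; the source structures and field names are invented, and a real implementation would sit on a data platform rather than in-memory lists:

```python
# Hypothetical sketch: assembling a 360-degree customer view
# from siloed sources, keyed on a shared customer ID.
online_orders = [{"customer_id": "c1", "order": "o-100", "status": "open"}]
store_transactions = [{"customer_id": "c1", "txn": "t-55", "amount": 42.0}]
loyalty = {"c1": {"tier": "gold", "points": 1200}}

def customer_360(cid):
    """Merge each silo's records for one customer into a single view."""
    return {
        "customer_id": cid,
        "open_orders": [o for o in online_orders
                        if o["customer_id"] == cid and o["status"] == "open"],
        "store_history": [t for t in store_transactions
                          if t["customer_id"] == cid],
        "loyalty": loyalty.get(cid, {}),
    }

view = customer_360("c1")
print(view["loyalty"]["tier"])  # gold
```

The hard part in the real world is not the join itself but identity resolution – agreeing that the "c1" in the e-commerce system and the "c1" at the point of sale are the same person.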
Orchestrate transformational customer journeys. Decoding retail customer journeys is the starting point to digital transformation. In the era of design thinking and customer experience, a new paradigm of solution design is evolving. Yes, there are beacons, there is big data, there is fast data, there are mobile technologies and cloud applications that promise Nirvana. However, to get transformational business outcomes, there is a need for careful curation of experiences.
Bricks-and-clicks retailers must orchestrate an end-to-end experience that is beyond a pointed technology solution to solve a particular problem like knowing what the customer did on the website or what she purchased in a store. It is about bringing all the insights and business states about products, customers, and even assets like dressing rooms to curate a new digital journey for the customer in-store.
Empower store associates. Retailers must realize the importance of their associates as omnichannel evangelists who can make or break seamless experiences for the customer. Initiatives to incentivize cross-channel "save the sale" behavior are one key paradigm shift that retailers must consciously undergo.
The store associate must be equipped with data on products available across different distribution channels and, to be credible brand advocates, also must be as knowledgeable as her customer. She needs the right technology to have access to meaningful insights on her customer in order to offer a personalized experience. Tools and technologies that can provide data that deliver in-the-moment, 360-degree views on customers, enterprise-level inventory data, mobile point of sale, and in-built intelligence to provide the right recommendations (product recommendations, substitutes, alternate fulfillment options, dynamic offers) are critical for associate empowerment.
The benefits of executing well on all the above initiatives are increased footfalls, increased conversions with a multiplier effect across channels and, most importantly, increased customer loyalty and retention.

Why Retailers Should Recruit a Chief Omnichannel Officer Now

Thanks to modern technology and digital tools, the opportunities to interact with and buy from a brand today are ubiquitous. Customers want to shop anytime, anywhere. Omnichannel rules, and smart retailers are getting on board.
For the customer, the best of omnichannel creates a consistent and uniform experience across all touch points – online, brick-and-mortar stores, social media, events, mobile and more – all the time. For the retailer, omnichannel reaches its pinnacle of effectiveness when each channel's operations are connected at the back end and continuously feed integrated, customer-specific information into the organization. This highly valuable data can then be analyzed and acted upon to build a sound strategy for new – and even more consistent – marketing and sales efforts going forward.
Transforming a multichannel entity into a true omnichannel organization is much easier said than done. It is a job that requires a dedicated, totally focused individual that has the responsibility — and seniority — to integrate multichannel systems (literally and figuratively) across all customer touch points: store operations, marketing, call center, and digital (which includes all forms of non-store-based commerce). This is made all the more difficult because traditionally — and naturally — most of today’s organizational structures have evolved into fairly ingrained silos.

A chief omnichannel officer can help a retailer go from silos to seamless. Here are the specific responsibilities the officer should tackle:
Eliminate silos
Customer touch points today usually exist in the store as point-of-sale systems, online as e-commerce systems, on the go as m-commerce platforms, in the contact center, and in other systems. Until now, sales and other information has been collected and stored right back within these separate system silos.
Retailers still getting used to multichannel efforts have traditionally kept channels independent of one another. This approach is fine, but does it really provide a true picture of how a customer interacts with a brand all the time? A savvy chief omnichannel officer will eliminate silos and integrate all channels at the back end, then take the next step: making the most of the data generated by the customer.
Get the most out of customer information
To turn customer data into real information assets in aggregate, a central repository that can syndicate useful product information back out to the various channels must be created – and that is one big job. Today, disparate CRM systems struggle to produce a single, consistent view of the customer. Customer information is one of the most valuable assets in retail, but in a multi-siloed organization this data is rarely utilized properly.
The lifeline of a truly effective omnichannel experience is data that is integrated across every customer touch point, and that means integrating existing systems without diminishing each system's effectiveness – a tricky IT challenge that should be front and center for a chief omnichannel officer.
Get staff on the same (omnichannel) page
Technical problems aside, siloed skills among staff create their own issues. Disconnects exist between a retailer's business and technical staff. Open conversations that focus on people, processes and technology are rare between the chief marketing officer and the CIO.
Separate heads for all functions — marketing, finance, merchandising, HR, stores, etc. — all report to the CEO or president. As a result, very few people have a holistic understanding of the business, much less what it takes to create an omnichannel presence. What’s more, most high-level, C-suite executives are too tied up with other business issues to commit to the kind of focus necessary to drive the creation of a functioning omnichannel organization.
A key responsibility of an omnichannel officer should be to drive — from a senior level — a commitment to omnichannel throughout the organization, oversee accountability in that commitment, and ingrain omnichannel into the company culture. Change is hard, but breaking silos to achieve synchronization, alignment and ownership among staff is paramount.
The omnichannel chief must encourage active involvement, monitoring, facilitation and support from channel leaders. To do this and communicate effectively with function heads, the officer must have an understanding of all customer touch points, the organization’s holistic business needs, and a direct reporting status to top leadership.
Be interested in revenue generation
Transformation into an omnichannel organization might come faster if, besides managing the development of strategies that integrate the company’s systems, people and activities, the chief omnichannel officer takes on somewhat of a P&L role.
When recruiting for the position, discuss the possibility of responsibility for revenue generation activities along with a reasonable share of the profitability. In the ideal scenario, the chief omnichannel officer will look after the execution of omnichannel and will also be responsible for the ROI on marketing investment. In that way, he or she can instill an organic acceptance of omnichannel best practices across all departments, while at the same time encouraging digital growth in a way that doesn't affect current high-performing channels.
No doubt, the ideal candidate needs to be one talented and well-rounded individual. Someone with strong digital marketing experience and exposure to other key business functions is a good place to start, and that background should enable the individual to grow into the role properly in a short period of time.
Simply put, transforming an organization into an omnichannel powerhouse is an exercise in managing change. Placing the right person in charge near the top of the organization will make it clear to all that this is an initiative to be taken seriously. Setting the right tone with all stakeholders will speed sincere acceptance and motivate everyone to deliver. If a retailer can achieve this, the company is on its way to converting its investment in omnichannel into tangible long-term results and strategic market advantage.
Salil Godika is co-founder of Happiest Minds, a next-generation digital transformation, infrastructure, security and product engineering services company. With 19 years of experience in the IT industry across global product and services companies, he previously spent four years at Mindtree as chief strategy officer/M&A, holding P&L responsibility for an industry group. Before Mindtree, Salil gained 12 years' experience in the United States working for software product companies, both large and start-ups.

Friday, 6 November 2015

Web Summit 2015: The 3 Day Tech Event Wraps up

Amidst the incessant rain, the amazing tech event Web Summit 2015 wrapped up on 5 November in Dublin. The 22,000 tech enthusiasts, myself included, are returning to our home countries with lasting memories of insightful tech talks and the networks we built over the last three days. Web Summit is shedding its Irish roots and will move to Lisbon next year. Another significant announcement from the organizers was that 10,000 female entrepreneurs will be given free tickets in 2016 across conferences in the US, Hong Kong, India and Lisbon. The Indian chapter of the conference, called Surge, is scheduled to be held in Bangalore on February 23-24, 2016.


Coming back to the third day's tech talks: 3D printing, virtual reality and smart umbrellas took center stage on Day 3. Multiple vendors showcased 3D printing technology at their booths. An interesting startup showcased "Oombrella", the first smart, connected umbrella, which launched on Kickstarter; it collects weather data and sends real-time alerts to users. The umbrella is connected to a smartphone app that gives rain alerts for the day, and if you accidentally leave your umbrella behind, the smartphone will buzz with a message from the umbrella giving its GPS location. The Myo armbands from Thalmic Labs also caught the audience's attention. Myo combines motion sensing with the detection of muscular activity in the arm, enabling gestures to control a range of devices.


The best startup of 2015 award went to Connectera, an Amsterdam-based startup whose product analyzes real-time movement data of cattle to help improve farm productivity. Bizimply, a cloud-based software-as-a-service business that provides workforce management software to businesses with hourly paid employees, also received a startup award at Web Summit 2015.

Ed Catmull, president of Pixar and Disney Animation Studios, and the New York Times best-selling author Dan Brown were the two personalities who drew the most audience attention on the closing day. Catmull closed the Web Summit to an absolutely jam-packed RDS main hall.

In a nutshell: with more than 22,000 tech enthusiasts from 134 countries, 21 different mini-summits, 1,000 speakers, 2,141 start-ups and 1,000 investors, the greatest tech festival on the planet ends here in Dublin. I am bringing back many invaluable contacts and networks, worth building into future business opportunities, to Happiest Minds, the next-generation digital transformation company I represent. A big goodbye to all my new friends and the bustling RDS venue in Dublin, and a big thank you to the Web Summit organizers for the wonderful opportunity to be a part of Web Summit 2015.

Thursday, 5 November 2015

Web Summit 2015: The Tech World Musings From Dublin

Having grown from 400 attendees five years ago to 22,000 tech enthusiasts today, Web Summit 2015 continues to deliver innovative ideas and fascinating thoughts to the tech world gathered at the bustling RDS venue in Dublin.

Cars and technology took center stage on day 2. Augmented reality, virtual reality, drones and wearables were among the other highlight topics that seized the audience's attention. Here are some of the most interesting tech talks from day 2 in Dublin. Ford chief executive Bill Ford pointed out the promising intersection between cars – an industry that has been "revolution-resistant for a hundred years" – and technology. He added that Ford is redefining itself as a "mobility company" with an interest in autonomous driving and net-connected cars, along with data collection and analytics. Sean Rad, CEO of the location-based dating app Tinder, spoke about the data that drives the platform and its future. On a lighter note, he added that the Irish user base is extremely active on the app.


The most exciting part of day 2 was the live demonstration of a drone flown onto center stage by Randy Braun of DJI, a world leader in camera drones and quadcopters. The tech enthusiasts, me included, heard with curiosity that DJI, along with the Humanitarian UAV Network, uses its drones, or unmanned aerial vehicles (UAVs), in a wide range of humanitarian and development settings. Google showcased its famous virtual reality platform, Google Cardboard – basically a cardboard case for smartphones that works in conjunction with compatible apps to project 3D images or videos. The wonders of virtual reality did not end with Google's Cardboard. The Colombian company Protesis Avanzadas showcased a 3D robotic prosthetic hand on the summit's center stage – an affordable, multifunctional prosthetic that can replicate many of the grip patterns of the human hand. Dave Jakubowski, head of ad tech at Facebook, took to the Marketing Summit stage to discuss the state of the industry and FOMO (fear of missing out) in the digital age.

All the tech talks surrounding virtual reality, augmented reality and machine learning remind us that we are swiftly moving into an age of transformation, where the line between the digital and the real world is slowly blurring. These technology advancements also hold great potential to redefine existing business models. As part of Happiest Minds, a digital transformation company strongly focused on new-age disruptive technologies including IoT, big data, M2M learning, cloud and mobility, I strongly feel that very interesting days lie ahead in terms of both technology and customer experience.

Anticipating more exciting and insightful talks and demos from the Web Summit 2015 stage on the closing day, 5 November. Stay tuned.

Wednesday, 4 November 2015

5 Drivers for Securing The Internet of Things

If you have any doubt at all about the impact of the IoT, consider these facts: 75 percent of the world's population has access to a mobile device, and when you compare the number of connected devices in 2009 (0.9 billion) to the more than 26 billion estimated to be connected to the internet by 2020, that is roughly a 30-fold increase.

Along with the massive growth of IoT is the growth of corresponding security issues. As connected devices increase, so does the amount of data generated and transferred by these devices. As more data is transferred, the number of pathways and parameters for the cyber criminal to exploit also increases. It all adds up to the need for more protection than ever before.
Vital role of the CISO

As the world of IT security transforms to meet this exponential growth, the role of the CISO becomes vital in terms of defining the IT security strategy.

Before IoT, the IT and Operational Technology (OT) layers were controlled and secured differently: IT security focused on the confidentiality of data and on network infiltration, while OT security emphasized physical security, safety and business continuity. Now that more devices are connected to the internet, the OT layer has become increasingly IP-enabled, making it more vulnerable. Traditional security models must adapt, and the CISO must create a unified IT security strategy.

Attention to the following key drivers will assist the smart CISO with devising a strategy that truly works in securing the IoT:

1. Layer visibility. The OT layer, the IT layer and any other layers of the network should have visibility and be encompassed by an overall, unified security plan of action. No layer or device should be exempt.

2. Threat visibility. New devices mean new loopholes and threat vectors. A sound strategy should take into account not only existing vulnerability, but potential vulnerability, as soon as a device is connected to the network. A real-time threat assessment and definition that works around the clock is key to preventing new attacks.

3. Platform visibility. The creation of a monitoring apparatus that is agnostic is vital in today’s software platform environment of continuous updates, open source and self-imposed redundancy.

4. Network encryption. Point-to-point and point-to-multipoint encryption should be based on network segments, network protocols and network flows. In other words, internal networks in their entirety must be encrypted to ensure security long term.

5. Automated remediation. The end goal of IoT security should be an approach that requires no human intervention. Automated, immediate security control utilizing machine-to-machine intelligence is key to a unified security strategy that is not only successful but also cost-effective.
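To make the idea concrete, here is a deliberately minimal sketch of such a machine-to-machine loop: monitoring reports a per-device threat score, and anything over a threshold is quarantined automatically, with no human in the loop. The device names, scores, and threshold are invented for illustration:

```python
# Toy machine-to-machine remediation loop (illustrative only):
# devices report threat scores; anything at or over the threshold
# is quarantined automatically, with no human intervention.
THRESHOLD = 0.8
quarantined = set()

def quarantine(device_id):
    # In a real deployment this would push a firewall rule or
    # disable a switch port; here we just record the action.
    quarantined.add(device_id)

def ingest(device_id, threat_score):
    """Called by the monitoring layer for every device report."""
    if threat_score >= THRESHOLD:
        quarantine(device_id)

for device, score in [("thermostat-7", 0.2), ("camera-3", 0.93)]:
    ingest(device, score)

print(quarantined)  # {'camera-3'}
```

Real remediation policies are, of course, far richer – graduated responses, rollback, and audit trails – but the principle of closing the detect-to-respond loop without waiting on a person is the same.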
IoT growth poses challenges for the forward-thinking CISO as scale increases, scope broadens and the need for cohesive cooperation grows. Those who consider the above drivers can develop a security strategy that will address these challenges and pave the way for the organization to take advantage of the opportunities the IoT also brings.

Tuesday, 3 November 2015

Key Security Tactics to Help Protect Your Business from a TalkTalk hack

Isaac George, SVP & regional head of digital transformation company Happiest Minds UK, discusses the increased number of security threats UK organisations are exposed to following the TalkTalk hack.
Cyber crimes are not only occurring with mounting frequency in today’s wireless world, but they are also becoming increasingly sophisticated and widespread.
Just this month, major UK telecommunications, internet access and mobile network services company TalkTalk was the latest in a long line of brands to face media scrutiny after its website was breached by a significant and sustained cyber-attack.
The company said it was “too early to say” how many of its customers had been affected by the attack but credit card, bank account details, names, addresses, dates of birth, email addresses and telephone numbers could all have been accessed.
With a criminal investigation now underway, it is not yet known what the nature of the attack was, although early insight suggests that it may have been a distributed denial of service (DDoS) attack, where a website is hit by waves of traffic so intense that it cannot cope.
However, a second school of thought holds that the DDoS attack may have been a smokescreen to distract the organisation's defence team whilst the cyber criminals pursued their real objective of stealing data.
Should the second school of thought be accurate, this may even have been an Advanced Persistent Threat (APT).
What sets Advanced Persistent Threats (APTs) apart is the nature and scope of the attack as they stealthily exploit vulnerabilities over a period of time.
Gartner puts it simply:
‘Advanced’ means it gets through your existing defences.
‘Persistent’ means it succeeds in hiding from your existing level of detection.
‘Threat’ means it causes you harm.
Once inside the network, APTs move around surreptitiously, seeking out sensitive data rather than disrupting systems and raising red flags.
These attacks are well coordinated and have very specific objectives that target key users within the organisation to gain access to high-value information – be it top-secret military or government documents, trade secrets, blueprints, intellectual properties, source codes and other confidential information.

The worst part is that no organisation, irrespective of size or type, is immune to these attacks.
What is clear, whether it turns out to be DDoS, APT or another means of cyber-attack, is that the bottom line remains the same: many of today's businesses are relying on basic security defences like firewalls, anti-virus and anti-spyware tools that were conceived years ago to deal with the attacks of that era.
This means it is only a matter of time before our traditional cyber security systems face the next generation of attacks, and it is unlikely that they will succeed.
It is now imperative to develop a layered security approach that amps up the security arsenal with 360-degree visibility into all corners of the network.
Forewarned is forearmed – Key elements to APT defence
Unfortunately, there is no magic wand to combat APTs. The stealthy and random nature of APTs makes it a daunting task to predict attacks. Daunting, but not impossible.
The time has come for organisations to move beyond a perimeter-based ideology to a more comprehensive, multi-layer security approach that ensures continual protection even in the case of a breach. The critical elements of a successful APT defence lie in an intelligent combination of defence, analytics and a proactive incident response plan.
1. Know what to protect

The first step in any APT defence strategy is knowing what assets to protect. Once this data is sorted and classified, it provides a bird's-eye view of your infrastructure across storage, security and accessibility, spanning devices and endpoints.
2. Assess your security loopholes

The next step is to identify and categorise the most-at-risk information systems and high-liability assets that link back to critical data. Assessing these systems enables us to prioritise protection and remedial plans against potential vulnerabilities. It is especially important that risk assessment is an ongoing process to keep abreast of the ever-evolving threat landscape.
3. Shore up monitoring and detecting capabilities

Comprehensive monitoring of all inbound, outbound and internal network traffic is imperative to contain the scope and impact of a potential attack. Additionally, advanced detection and real-time analytic tools, in conjunction with traditional security solutions, enable organisations to identify malicious activities as and when they occur.
A truly effective solution lies in the ability to differentiate normal and anomalous traffic patterns or activities generated by any IP-based device that connects to the network. By applying threat intelligence through analytics, these real-time insights allow for immediate isolation and remediation to stop the attack in the early stages.
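One simple way to illustrate the "normal versus anomalous" distinction is a statistical baseline per device: if its current traffic deviates sharply from its own history, flag it. The sketch below is a deliberately naive illustration (a z-score check on invented byte counts), not a production detection technique:

```python
# Illustrative anomaly check: flag a device whose current traffic
# deviates sharply from its own historical baseline (z-score).
from statistics import mean, stdev

def is_anomalous(history, current, z_threshold=3.0):
    """history: past per-interval byte counts for one device."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:  # flat baseline: any change is suspicious
        return current != mu
    return abs(current - mu) / sigma > z_threshold

baseline = [100, 110, 95, 105, 98, 102, 99, 104]
print(is_anomalous(baseline, 103))  # False: within normal range
print(is_anomalous(baseline, 900))  # True: possible exfiltration
```

Real analytics platforms model many more signals (destinations, protocols, time of day) and use far more robust statistics, but the underlying idea – each device judged against its own learned baseline in real time – is what enables the early isolation described above.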
4. An informed user is a safe user

Because APTs are often delivered in the form of phishing emails, employees are the most susceptible targets. It does not take much to trigger a malicious code through an enticing link or attached file.
Security education and training makes employees aware of the potential security pitfalls of BYOD and cloud services. It also places some level of responsibility on the employees themselves to ensure that sensitive data remains secure.
5. Put an APT incident response plan in place

It is absolutely vital for an organisation to have a carefully crafted and up-to-date incident response plan in place.
It guides the organisation in quickly identifying and responding to a potential breach, and this is what ultimately determines the effectiveness of the organisation's response to an attack.
Staying ahead of the APT curve

The complex nature of APTs poses huge challenges to our standard security defence systems. On the flip side, it provides a much-needed impetus to reassess frameworks and utilise solutions that scale to protect the entire organisation.
This latest attack against TalkTalk's website is a huge wake-up call to the business community at large about the perils of delaying positive action against cybercrime. It is not easy to secure your business against every type of attack, but the fact remains that a multi-pronged, layered approach to security is no longer an option but a must-have.
If you need convincing, you only have to look at the huge financial and reputational losses that will ensue for TalkTalk.

Isaac George is the SVP and regional head at infrastructure, security and product engineering services company Happiest Minds UK