
Friday, 20 May 2016

A Key Element of Digital Strategy Today is Content Monetization

One global market and technology research firm is increasing revenue dramatically through innovative digital content monetization. By using a SaaS platform, it achieved this in weeks instead of months.
The company’s clients purchase high-value market and technology intelligence reports, and they are more than satisfied with the quality and quantity of the content they receive. Yet there are still many opportunities to enrich existing content in ways that add value for clients and thereby increase revenue. With automated micro-segmentation, the company now also generates slimmed-down reports instantly for more price-conscious clients. This has opened up a new market segment and a completely new revenue stream.
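As a rough sketch of how automated micro-segmentation might work (the tiers, section names and function below are purely illustrative, not taken from the firm's actual platform):

```python
# Illustrative sketch: each report section is tagged with the minimum
# customer tier entitled to receive it, and a slimmed-down report is
# produced by filtering on the buyer's tier. Names are hypothetical.

BASIC, STANDARD, PREMIUM = 1, 2, 3

report = [
    ("Executive Summary", BASIC),
    ("Market Sizing", STANDARD),
    ("Vendor Landscape", STANDARD),
    ("Proprietary Forecast Model", PREMIUM),
]

def slim_report(sections, customer_tier):
    """Keep only the sections this customer's tier is entitled to."""
    return [title for title, tier in sections if tier <= customer_tier]
```

A basic-tier client instantly receives only the summary, while a premium client gets the full report; the same source content serves every segment.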
How might your business apply these lessons and increase content-based revenue?
Here are three steps to get started. 
First, analyze the content that you generate now for different customer segments.
Second, ask how you could customize content for high-value customers so that they realize more value and are willing to pay more. Take advantage of a SaaS platform that can do this automatically.
Third, consider cost-conscious customer segments that may be interested in your content but that you are not serving now because your entry price is too high for them. You can offer free sample report summaries to this segment, and perhaps a discount on the first few purchases to prove your value.
I’d love to hear your thoughts on monetizing content. In this fast-developing world of digital transformation, we can all benefit from sharing ideas.

Wednesday, 6 April 2016

Happiest Minds Saves $175,000 by Going Paperless

Happiest Minds Technologies is taking small steps to digitize all departments. We spoke to CIO Darshan Appayanna on what triggered the idea to make their office paperless.

Consider this: It’s 7:00 p.m. and your boss is breathing down your neck to finish the job in an hour’s time. This gives you a panic attack. All the papers are scattered on the table and the very thought of searching for the required document gives you goose bumps. You faint and the boss gives you the pink slip.

Ever thought about working in a paperless office? Well, if you haven’t, the time has come to change your thinking: the paperless office is no longer a fantasy.

Happiest Minds Technologies was quick to take advantage of this and took the plunge, making many workflows paperless. The accounts payable, accounts receivable, contract and legal teams have digitized their workflows.

At the same time, the groundwork has been laid to digitize everything in the HR department. This will become operational in the second half of 2016.

When did the idea strike Happiest Minds Technologies to make departments in the office paperless? Talking about the old system, Darshan Appayanna, CIO, Happiest Minds Technologies, says, “When the exit process is triggered on the last working day of employees, they carry a couple of sheets of paper and meet the finance, IT, facilities, and HR team to check if any dues need to be recovered or paid. Completing all the checks from the departments, it finally gets signed.”
This caused a lot of frustration and wastage of time as employees had to run from pillar to post.
So, what did the company do? “We automated work through the common workflow platform called ‘Smile Sign’. Our workflow platform is our primary foundation, which has been built to ensure that anything which needs audit trail is routed through the workflow mechanism in order to have efficient access to information,” says Appayanna.

Before digitization in finance, approving a cab service meant sending an email and filling out a form mentioning the pickup route. Now, routes are defined in the system; once a request reaches the admin team, they do the allocation and, if everything is in order, give their approval.

With respect to travel and expense management, “We have made sure that people don’t have to submit physical copies of bills. Scanned copies are accepted. At the same time, reimbursements and expense settlement are also digitized,” says Appayanna.

Also, any invoice that comes from a vendor is digitized and accessible to the finance team. Unwieldy folders have become a thing of the past. Appayanna says that people no longer waste time opening a storage room to search for files or dig out invoices.

Coming to the HR department, the company has begun to automate employee dockets. “We are introducing a new platform where, at the time of joining, people can send digital copies of their mark sheets instead of taking a printout,” says Appayanna. This platform is ready and will be rolled out by the second half of 2016. The advantage is that everything is searchable.

Appayanna says: “It reduces bureaucracy as people don’t have to run around. They can do the work at their own convenience.”

Appayanna said that, earlier, when auditors came in, the finance and HR teams spent a lot of time searching for documents. “Now we give them access to our repository, which is catalogued. When any paper comes, it is uploaded and tagged appropriately with the reference number to a particular department. With this, they can search with the reference number and get the listed documents they wish to look at.”

It has been a year and three months since the company launched this project, and it has paid dividends. Highlighting a major gain, Appayanna says, “We have saved $175,000 yearly in terms of documents being made available and acceptable at any time.”

Tuesday, 29 March 2016

Happiest Minds’ ThreatVigil Wins GOLD at the Infosecurity Product Guide’s Global Excellence Award 2016

Happiest Minds Technologies, a next generation digital transformation, infrastructure, security and product engineering services company, today announced that ThreatVigil, its cloud-based threat management solution, won top honours in the Infosecurity Product Guide’s Global Excellence Awards. The solution received a Gold award in the Vulnerability Assessment, Remediation and Management category.

ThreatVigil is an on-demand, cloud-based threat management solution that comprises vulnerability assessments and penetration testing of all system components, including business applications, databases, secure network perimeters, systems and network infrastructure, mobility solutions and virtualized cloud environments. Developed with a combination of industry-proven automated tools and in-depth manual assessment techniques, it is highly scalable and offers faster and simpler deployment options with no dependencies.

"We are honoured to be recognized as an industry leader yet again by the Info Security Products Guide Global Excellence Awards. We are seeing increasing global demand for our cloud-based security solutions and these recognitions from industry forums reinforce the impact we are making with our IP based solution accelerators. This award reinforces our commitment to provide an innovative and pragmatic approach for enterprises to help them protect themselves against the dynamic and emerging threat landscape,” said Prasenjit Saha, President, Infrastructure Management Services and Security Business, Happiest Minds Technologies.

The security industry celebrated its 12th Annual Global Excellence Awards in San Francisco by honouring excellence in every facet of the industry, including products, industry leaders and best companies. Many well-recognized companies participated across 45 categories, and more than 50 judges from a broad spectrum of industry voices around the world evaluated the nominations. The average of their scores determined the finalists and winners, announced during the awards dinner and presentation attended by finalists, judges and industry peers.

About Happiest Minds Technologies:

Happiest Minds enables Digital Transformation for enterprises and technology providers by delivering seamless customer experience, business efficiency and actionable insights through an integrated set of disruptive technologies: big data analytics, internet of things, mobility, cloud, security, unified communications, etc. Happiest Minds offers domain centric solutions applying skills, IPs and functional expertise in IT Services, Product Engineering, Infrastructure Management and Security. These services have applicability across industry sectors such as retail, consumer packaged goods, e-commerce, banking, insurance, hi-tech, engineering R&D, manufacturing, automotive and travel/transportation/hospitality.

Headquartered in Bangalore, India, Happiest Minds has operations in the US, UK, Singapore, Australia and has secured US $52.5 million Series-A funding. Its investors are JPMorgan Private Equity Group, Intel Capital and Ashok Soota.

About Info Security Products Guide:

Info Security Products Guide plays a vital role in keeping end-users informed of the choices they can make when it comes to protecting their digital resources. It is written expressly for those intent on staying informed of security threats and the preventive measures they can take. You will discover a wealth of information in this guide, including tomorrow's technology today, best deployment scenarios, people and technologies shaping info security, and market research reports that facilitate the most pertinent security decisions. The Info Security Products Guide Global Excellence Awards recognize and honour excellence in all areas of information security. To learn more, visit www.infosecurityproductsguide.com and stay secure.


Sunday, 13 March 2016

Top 5 Reasons for investing in Customer Experience

Gone are the days when success in business was determined by a premium-quality product or service, value for money and good customer service. In this age of extreme competitiveness, led by disruptive technologies and allied digital transformation services, the key to success of any business lies in the customer experience it delivers. With the widespread reach of social media and real-time interactions via the internet, customer expectations have broadened. Customer experience in this digital transformation era is the cumulative experience across multiple touchpoints, which results in a long-term relationship between the business and the customer. But how to create and deliver the most appealing customer experience is the question the business world now faces.

Let us have a look at the Top 5 reasons for investing in Customer Experience
  1. Drive loyalty: Enhance brand loyalty through engaging programs and gamification
  2. Increase Revenue: Develop Omnichannel experience to create multiplier effect
  3. Improve Customer Service: Understand 360 degree customer view to contextualize interactions
  4. Reduce Customer Churn: Know how to keep your customer sticking to your brand
  5. Competitive Advantage: Enhance data capture and analysis capabilities to gain a competitive advantage

The business value of a great customer experience is enormous, which is prompting global businesses to put the strategy, funds and processes in place to build an effective customer experience practice. Those making the changes needed to strategically prioritize CX will gain an edge over their competitors.

Thursday, 3 March 2016

Transforming From Traditional IAM to Business Driven IAG

Providing the right people with the right access at the right time is critical in any organizational environment, irrespective of size. In this age of explosive growth in network communications, increasing collaboration and policies like BYOD, it is challenging for enterprises to determine who has access to which resources and what they are doing with that access. Comprehensive governance controls are essential to reduce the risks of unauthorized access and mishandling of sensitive data, which can take a toll on an organization’s reputation. It is also critical to comply with governance regulations that mandate access controls.

Traditional IAM (Identity and Access Management) is focused on access management, provisioning and de-provisioning, and the related compliance. Enterprises still struggled to meet compliance requirements, since this is not an all-inclusive solution: it focuses mainly on automating the user life cycle. Traditional IAM implementations are IT-driven rather than business-driven, and a provisioning-driven approach rarely achieves the expected business value. Traditional IAM is not involved in user access reviews or periodic user access certification. A classic example is a user who requests and is granted access to a critical application for a temporary period; traditional IAM gives zero visibility into that unwanted access and its usage. Governance-driven IAG gives you real-time visibility into access changes.

Historically, IAM systems have been used by IT organizations to manage the life cycle of user accounts across multiple systems. These systems are connected to user directories to authenticate users and retrieve basic profiles such as name, title and department. With this information, IAM can tell who the user is, but it cannot tell you about a user’s entitlements, which are key to an application because they decide what each user can do with the application and its data. The challenge with the provisioning-driven approach is that if, for example, a user requests and receives access to a CRM application, and that access is controlled using a group or entitlement, traditional IAM will provision the user to the entitlement but provides no visibility into what the user can actually do in the CRM with it.

IAG (Identity and Access Governance) systems help business people determine what a user can do within an application. They collect information about user identities, entitlements and roles from all applications. In addition, IAG provides more visibility into each entitlement, presenting information about it in a business context rather than a technical one. This helps business managers understand the entitlements users request, which in turn improves compliance.

Governance-driven IAG concentrates on a risk-driven approach. It also focuses on entitlement management, which provides a more granular level of visibility into user access, and it enables periodic user access reviews and certification. Governance-driven IAG emphasizes fast integration of applications across multiple platforms while providing more visibility into user access. This model ensures appropriate access for all users, automates the user access review process, and simplifies provisioning and de-provisioning.
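A toy sketch of the visibility difference (the entitlement names and review window below are hypothetical): an IAG-style catalogue pairs each technical entitlement with a business-readable description, and a periodic access review flags grants that were never used or have sat idle.

```python
from datetime import date, timedelta

# Illustrative only: maps technical entitlements to business-readable
# descriptions, the kind of context traditional IAM does not surface.
CATALOGUE = {
    "CRM_GRP_42": "Can edit customer contact records in CRM",
    "FIN_RPT_RO": "Read-only access to financial reports",
}

def describe(entitlement):
    """Present an entitlement in business terms for access certification."""
    return CATALOGUE.get(entitlement, "Unknown entitlement - needs onboarding")

def flag_stale_grants(grants, today, max_idle_days=90):
    """grants: list of (user, entitlement, last_used date or None).
    Flags grants never used, or unused beyond the review window."""
    cutoff = today - timedelta(days=max_idle_days)
    return [(u, e) for u, e, last in grants if last is None or last < cutoff]
```

In a real IAG deployment this data would be collected automatically from all connected applications; the sketch only shows why entitlement-level context makes access reviews meaningful.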


In today’s complex IT landscape, where solutions depend on multiple heterogeneous platforms and enterprise applications extend into the mobile and cloud space, tighter regulatory controls are required to protect enterprise data from unauthorized access. Governance-driven Identity and Access Management allows organizations to review, audit and enforce policies for fine-grained access privileges across the IT environment. It can also bring end-to-end visibility and control across all critical systems and applications, a breadth of coverage that is more efficient and reliable than traditional IAM solutions.

Wednesday, 27 January 2016

How to Select the Right Set of Devices for Mobile Testing?

Mobile phones have been a great revolution for mankind. Big static landline phones have slowly become obsolete, and mobile phones are now an indispensable part of everyday life. Today, almost every person owns at least one phone, and a few own more than one. The smartphone’s entry into the market has made people so dependent on the gadget that without it they would feel lost. It is now evident to almost every industry that the easiest way to reach customers is via apps. Most mobile solutions are built on new-age disruptive technologies. Compared to other computing devices, the reach of mobile phones is huge: accessible to everyone from high-profile business magnates to a roadside tea seller, mobile phones have bridged the so-called “digital divide” to a large extent.

Challenge
From its traditional role as a mere communication tool, the mobile phone has become a multipurpose gadget used for both personal and professional purposes, which creates opportunities as well as challenges, like two sides of a coin. Technology shifts and the proliferation of devices and operating systems create challenges for hardware manufacturers and application developers in developing, rolling out and updating products. Mobile application testing across various devices and platforms has become even more challenging. With so many mobile makers in the market, it is almost impossible to test properly on every device, and to a certain extent it is not required either. Digital modernization has encouraged people of all age groups to keep their important data and images in the cloud, rely on apps as reminders, use messengers to keep in touch, and much more. As these wants have become basic needs, one key area of focus should be the customer experience. The user’s geography, age group and the targeted customer group’s profile are crucial in deriving the best possible customer experience. Another point to remember is that in the competitive mobility arena, go-to-market time has shrunk considerably, and if you delay, someone else will take your place. Hence, quality has to be ensured in a short span. These facts will drive anyone to be choosy about devices when it comes to validation.

Let us have a look at the various devices that need to be considered for any mobile application testing.

Solution
By considering a few parameters, we can nail down the devices on which the app needs to be tested.

Parameters to be considered are –
1.    Type of devices
2.    Form Factors
3.    OS

Type of devices –
Nowadays, most apps are made available on almost all types of devices, and hence we need to ensure that the app’s user interface and user experience parameters are met on all device types, including phones, tablets and phablets.

Form Factor –
In most cases, the size of the screen is confused with the resolution of the screen. These are two separate parameters to be considered. Resolution is the number of pixels on the screen, irrespective of screen size; how the app looks and how objects are placed on the screen depend on this parameter. Two devices may both have a 5-inch screen, yet their resolutions might differ.
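The distinction can be made concrete with a quick calculation: pixel density (pixels per inch) combines resolution and physical screen size, so two 5-inch screens at different resolutions render very differently. A small sketch:

```python
import math

def pixels_per_inch(width_px, height_px, diagonal_in):
    """Pixel density: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Two screens of identical 5-inch size but different resolutions
# yield very different densities, so layouts must be checked on both.
full_hd = pixels_per_inch(1080, 1920, 5.0)
hd = pixels_per_inch(720, 1280, 5.0)
```

This is why both size and resolution belong in the device matrix: an app that looks crisp and well laid out at one density can show misplaced or blurry elements at another.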


OS (iOS, Android, Windows and BlackBerry) –
The end product is always built from a combination of newly developed code and third-party features or services, and a developer may have tweaked third-party features to meet the product’s requirements. Hence, we need to ensure that this combination works correctly on the multiple OSes, and the different versions of each, that are in major use. There is no point in testing outdated OS versions, as users keep moving to newer ones, but we do need to think through which is the oldest version we must support.

Matrix of devices shortlisted for testing
Note –

1.    The information below is as per GSMArena.com.
2.    Only iOS and Android are considered in the matrix below to illustrate the exercise.

Type            Device                     Resolution    OS versions   Size (inch)   Test Type
Android Tab     Nexus 10                   2560 x 1600   5.X.X         10            Func & UX
Android Tab     Nexus 7                    800 x 1280    5.X.X         7             UX
Android Tab     Micromax Canvas Tab P470   600 x 1024    4.4.x         7             Func & UX
Android Phones  Sony Xperia Z5             2160 x 3840   5.1.x         5.5           Func & UX
Android Phones  Nexus 6P                   1440 x 2560   6.x           5.7           Func & UX
Android Phones  Samsung Galaxy S4          1080 x 1920   4.2.2         5             Func & UX
Android Phones  Moto G                     720 x 1280    5.1.1         5             UX
Apple Phones    iPhone 5s                  640 x 1136    iOS 7         4             Func & UX
Apple Phones    iPhone 5s                  640 x 1136    iOS 9.x       4             Func
Apple Phones    iPhone 6                   750 x 1334    iOS 8.x       4.7           Func & UX
Apple Phones    iPhone 6s                  750 x 1334    iOS 9.x       4.7           UX
Apple Phones    iPhone 6s Plus             1080 x 1920   iOS 9.x       5.5           UX
Apple Tab       iPad Air 2                 1536 x 2048   iOS 9.x       9.7           Func & UX
Apple Tab       iPad mini 2                1536 x 2048   iOS 8.x       7.9           Func

Though the above list is a handpicked list of devices, it looks exhaustive, and testing on all of them seems difficult. The idea is to cover all the form factors, OSes and device types across different brands, which is why the list looks big.

There is no shortcut if we have to validate functionality across different OS versions and user interface and user experience factors across different form factors. Hence, the combination of devices and OS versions is selected with these facts in mind. Across OS versions, functionality is validated to ensure that the newly developed code and third-party features work without functional flaws. Across form factors, UX parameters are validated to make sure that all objects on the screen fit properly as per the agreed mock-ups, with no overlapping or partially hidden objects.

While doing functional validation on different devices, you will naturally spot UI and UX glitches. So when testing UX-only scenarios, you know what has already been covered alongside functional testing and can focus on the remaining areas.
One should always keep an eye on the market for new devices, OS versions and browsers, and check whether they fit into the above table. With this exercise, it is easy to arrive at the devices to be considered for testing.
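The shortlisting exercise itself can be automated as a coverage check. A minimal sketch (the device entries and size buckets below are illustrative, not a recommended matrix):

```python
# Illustrative sketch: verify that a device shortlist covers every required
# OS major version and form-factor bucket, and report any gaps.

devices = [
    {"name": "Nexus 10",   "os": "Android 5", "size": 10.0},
    {"name": "Nexus 6P",   "os": "Android 6", "size": 5.7},
    {"name": "Galaxy S4",  "os": "Android 4", "size": 5.0},
    {"name": "iPhone 5s",  "os": "iOS 9",     "size": 4.0},
    {"name": "iPad Air 2", "os": "iOS 9",     "size": 9.7},
]

def bucket(size_in):
    """Rough form-factor buckets by diagonal size (thresholds are illustrative)."""
    if size_in < 6:
        return "phone"
    if size_in < 8:
        return "phablet/small tab"
    return "tablet"

def coverage_gaps(devices, required_os, required_buckets):
    """Return the OS versions and form-factor buckets the shortlist misses."""
    os_seen = {d["os"] for d in devices}
    buckets_seen = {bucket(d["size"]) for d in devices}
    return sorted(required_os - os_seen), sorted(required_buckets - buckets_seen)
```

Running such a check whenever a new device or OS version enters the market makes it easy to see whether the existing matrix still covers it or needs a new row.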


Despite short development cycles, go-to-market pressures and increasing competition in the mobility arena, mobile application testing across multiple devices and platforms is essential, daunting as it is. Effective and timely mobile testing enables device makers and application developers to collect the metrics that improve overall product quality and deliver great customer experiences.

Thursday, 21 January 2016

Staying Afloat During a Cyber-Attack

Given the rising frequency of increasingly malicious and innovative cyber-attacks, one can safely conclude that cyber risk is here to stay. It is no longer a question of ‘if’ but ‘when’ your organization will have to deal with a cyber-attack. The cost of a cyber security breach is significant—in terms of money, business disruption and reputation. Depending on the magnitude of the attack, a cyber incident can potentially put you out of business.

The best course of action for a business that is attacked is a swift and effective response. A cyber security strategy with efficient incident response (IR) capabilities coupled with customer engagement initiatives helps limit the damage and ensures that the business is up and running as soon as possible. Reaching out and engaging with customers reassures them, and helps a business that’s dealing with a cyber-attack to regain customer confidence, and prevent defection.

An effective IR strategy navigates the following phases:

Identify
Information on events is collected from various sources such as intrusion detection systems and firewalls, and evaluated to identify deviations from the normal. Such deviations are then analyzed to check if they are sufficiently significant to be termed an event. The use of automation tools ensures swift detection and eliminates delays in moving to the containment phase. Once a deviation is identified as a security incident, the IR team is immediately notified to allow them to determine its scope, gather and document evidence, and estimate impact on operations. Businesses can bolster this process by incorporating an effective security information and event management (SIEM) system into their cyber security strategy.
Contain
Once a security event is detected and confirmed, it is essential to restrict damage by preventing its spread to other computer systems. Preventing the spread of malware involves isolating the affected systems, and rerouting the traffic to alternative servers. This helps limit the spread of the malware to other systems across the organization.

Eliminate
This step focuses on the removal of the malware from the affected systems. IR teams then conduct an analysis to find out the cause of the attack, perform detailed vulnerability assessment, and initiate action to address the vulnerabilities discovered to avert a repeat attack. A thorough scan of affected systems to eradicate latent malware is key to preventing a recurrence.

Restore
In the restoration stage, affected systems are brought back into action. While bringing the affected systems back into the production environment, adequate care should be taken to ensure that another incident does not occur. Once these systems are up and running, they are monitored to identify any deviations. The main objective is to ensure that the deficiency or the vulnerability that resulted in the incident that was just resolved does not cause a repeat incident.

Investigate
This is the last step and entails a thorough investigation of the attack to learn from the incident, and initiate remedial measures to prevent the recurrence of a similar attack. IR teams also undertake an analysis of the response to identify areas for improvement.
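The five phases above can be sketched as an ordered workflow, so that no phase is skipped or run out of order. A minimal illustration (only the phase names come from this article; the enforcement logic is an assumption of how a tracking tool might work):

```python
# Phase names follow the IR strategy described above.
PHASES = ["identify", "contain", "eliminate", "restore", "investigate"]

class IncidentResponse:
    """Tracks an incident through the IR phases in strict order."""

    def __init__(self):
        self.completed = []

    def advance(self, phase):
        """Mark a phase done; reject phases attempted out of order."""
        expected = PHASES[len(self.completed)]
        if phase != expected:
            raise ValueError(f"cannot run {phase!r}; next phase is {expected!r}")
        self.completed.append(phase)

    def is_closed(self):
        """An incident is closed only after the final investigation phase."""
        return self.completed == PHASES
```

The point of the ordering is practical: restoring systems before elimination risks reinfection, and skipping the investigation phase forfeits the lessons that prevent a repeat attack.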

What enterprises need now are effective cyber security solutions to monitor and provide real-time visibility on a myriad of business applications, systems, networks and databases. There has been an increasing realization that basic protection tools for important corporate information are no longer sufficient to protect against new advanced threats. Furthermore, enterprises are under tremendous pressure to collect, review and store logs in a manner that complies with government and industry regulations.


Countering focused and targeted attacks requires a focused cyber security strategy. Organizations need to take a proactive approach to ensure that they stay secure in cyber space and adopt a robust cyber security strategy.

Monday, 7 December 2015

Disruptive Technology Weekly Roundup – Dec 1st to Dec 7th

The prevention, detection and response side of cyber security will see a sea change in 2016, says a new report from Forrester Research. According to Forrester, the five cybersecurity predictions, and the resulting actions to take in 2016, are as follows. First, in this era of disruptive technologies, where wearables and IoT are expected to become ever more prevalent, security and risk professionals should re-examine existing security functions from a new angle, and consider the human factor when addressing security threats. The second prediction concerns government security capabilities: the research firm gives a bleak assessment of the US government’s security capabilities, describing it as short-staffed, under-budgeted and lacking internal discipline. The third prediction is an expected increase in security and risk spending of 5 to 10% in 2016. Fourth comes defense contractors’ prospective entry into private industry with claims of ‘military grade’ security; however, Forrester warns private players to thoroughly vet their commercial experience and commitment before acquiring them. The fifth prediction covers HR departments, which will bring in identity and credit protection and resolution services as an employee benefit in this era of increasing fraud, identity theft, medical identity theft and damage to personal online reputations.

With the holiday season coming up, cyber security researchers in the US are warning about a malware strain, ModPOS, which is largely undetectable by current antivirus scans. The researchers also note that the malware has infected even some national retailers. According to them, it is one of the most sophisticated point-of-sale malware families, with a complex framework capable of collecting detailed information about a company, including payment information and the personal log-in credentials of executives. To address the threat, companies need to use more advanced forms of encryption to protect consumer data. Point-to-point encryption, where a consumer’s payment card data is unlocked only after it reaches the payment processor, is one effective way to combat the threat. Security experts warn that without such protections, even new credit cards with the chip technology known as EMV could still be compromised by infected point-of-sale systems.

The information security landscape is continuously evolving, and the proliferation of disruptive technologies like mobile, social, cloud and big data is increasingly shaping protection strategies. In-depth strategies to monitor, analyse and report security incidents are paramount to an effective enterprise security risk management profile. Happiest Minds, with deep expertise in the security arena and a large pool of experienced security professionals, brings security solutions that address the key challenges enterprises face today. Our services aim to improve the agility, flexibility and cost-effectiveness of next-generation information security and compliance programs.

How Do You Solve a Problem Like Cyber Security?

Happiest Minds UK discusses the new-age deception technologies UK businesses should adopt to bolster their cyber-security defences
The recent TalkTalk cyber-security breach has brought the issue of security firmly back into the public’s psyche and has put both government and organisations on high alert. It seems that regardless of your vertical market, be it finance, technology or banking, the threat of a cyber breach is pretty much imminent. Only today I read an article which outlined that Britain’s Trident nuclear weapons system may be vulnerable to cyber-attack by a hostile state, according to former defence secretary Des Brown.
So, despite the UK being one of the highest EU spenders on IT security, existing cyber security solutions are simply not good enough to stop malicious hackers and evolving threats. It’s little wonder that Chancellor George Osborne has pledged to spend an additional £1.9 billion on cyber security and has committed to the creation of a ‘National Cyber Centre’ to respond to major attacks on Britain.
So, how do you solve a problem like cyber security? Well, the answer could well be to implement emerging deception technologies such as next-generation honeypots and decoy systems which, according to a new Gartner report entitled ‘Emerging Technology Analysis: Deception Techniques and Technologies Create Security Technology Business Opportunities’, could have a game changing impact on enterprise security strategies.
Deception technologies are effectively tools which deceive attackers and enable the identification and capture of malware at the point of entry. They misdirect intruders and disrupt their activities at multiple points along the attack chain by luring them towards fake or non-existent data and away from the organisation’s critical data.
Let us look at a few of these technologies in greater detail:
Honeypots—or software emulations of an application or server—have been around for a few years now. A honeypot works by offering ‘honey’, something that appears attractive to an attacker, who will then expend his resources and time on gathering the honey. In the meanwhile, the honeypot does an admirable job of drawing his attention away from the actual data it seeks to protect.
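A bare-bones honeypot can be surprisingly small. The sketch below (illustrative only, not a production tool) listens on a port, logs every connection attempt and serves a fake FTP banner to keep the attacker engaged:

```python
import socket
import threading

def start_honeypot(host="127.0.0.1", port=0):
    """Listen on a port, log the source of every connection, send a fake banner.

    Returns the listening socket (close it to stop) and the shared log list.
    """
    log = []
    srv = socket.socket()
    srv.bind((host, port))  # port 0 picks a free ephemeral port
    srv.listen(5)

    def serve():
        while True:
            try:
                conn, addr = srv.accept()
            except OSError:
                break  # listening socket closed: stop serving
            log.append(addr[0])  # record the attacker's source address
            conn.sendall(b"220 FTP server ready\r\n")  # fake service banner
            conn.close()

    threading.Thread(target=serve, daemon=True).start()
    return srv, log
```

A real deployment would emulate the service far more convincingly and feed the log into a SIEM, but the principle is the same: attract the attacker, record everything, and waste his time on a target that holds nothing of value.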
Decoys are similar to honeypots and cause the attacker to pursue the wrong (fake) information. Many decoys act together to fill the attacker's radar, making it difficult for him to differentiate between real and fake targets.
However, organisations are now looking for more active defence strategies that not only lure in attackers, but also trap them, confound them and track their activity. One such deception technology offers an emulation engine masquerading as a run-of-the-mill operating system. The ‘operating system’ contains ‘sensitive’ data that could be attractive to attackers, for example data labelled ‘credit card info’. The platform will lure the attacker in by allowing him to ‘hack’ this fake data and in turn start gathering information about his movements and the codes that he seeks to modify. This intelligence can then be shared with other security tools, such as intrusion prevention systems, to defend against the attack.
A number of start-ups are designing various kinds of intrusion deception software that insert fake server files and URLs into applications. These traps are visible only to hackers, not to normal users. One example of such a snare is trapping hackers who probe for random files by granting them access to bogus files that are a dead end, leading them in circles towards ever more fake data. Another is protecting the system against brute-force authentication by scrambling the attacker's input so that he can never get the password right, even if he does happen to type the correct one.
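The "fake files and URLs" trap can be sketched in a few lines. This is our own simplified illustration, not any vendor's product: the trap paths below are invented, are never linked from the real application, and each one leads only to another trap, so only a probing attacker would ever request them:

```python
# Sketch of an intrusion-deception trap (our own simplified design).
# Trap paths are never linked from the real site; each resolves to the next,
# so a scanner following them just chases fake data in a circle.
TRAP_PATHS = {
    "/backup.zip": "/old/backup_2014.zip",
    "/old/backup_2014.zip": "/.git/config",
    "/.git/config": "/backup.zip",   # ...and back round again
}

def handle_request(path, client_ip, alerts):
    """Return (status, body); flag and mislead anyone touching a trap path."""
    if path in TRAP_PATHS:
        alerts.append({"ip": client_ip, "path": path})   # probable attacker
        next_trap = TRAP_PATHS[path]
        # A dead-end response that dangles the next piece of fake data.
        return 200, f"Archived. See also: {next_trap}"
    return 404, "Not Found"   # the real app would serve genuine content here
```

Because no legitimate user ever sees a trap URL, a single hit is a near-certain indicator of reconnaissance, and the `alerts` list can be fed straight into an intrusion prevention system.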
Other technologies set up fake IP addresses on webservers that, on multiple attempts to hack them, will always present a deception to that user. Other companies set up virtual systems or computers that actually have no data on them, and are indistinguishable from other machines on the network. Repeated intrusion into and unwarranted activity on these systems make it easy to identify hackers. The hackers’ movements and methods can then be analysed, and the data fed back into other threat detection solutions and tools.
Deception technologies therefore create baits or decoys that attract and deceive attackers, making it quicker for an organisation to detect a security breach. They increase the attacker's workload and exhaust his resources. Certain solutions go beyond merely setting up decoys to also conduct forensic analysis on these attacks, so the organisation can effectively defend its network and speedily mitigate security breaches. It may not be a 'one size fits all' answer to the cyber security conundrum, but it is certainly one more weapon in the organisation's armoury against hackers.

Wednesday, 25 November 2015

Taming the Elephant... Building a Big Data & Analytics Practice - Part I

A couple of decades ago, the data and information management landscape was significantly different. Though the core concepts of Analytics have not, by and large, changed dramatically, adoption and the ease of analytical model development have undergone a paradigm shift in recent years. Traditional Analytics adoption has grown exponentially, and Big Data Analytics calls for additional, newer skills.

For further elaboration, we need to go back in time and look at the journey of data. Before 1950, most data and information was stored in file-based systems (after the earlier invention and use of punched cards). Around 1960, Database Management Systems (DBMS) became a reality with the introduction of hierarchical database systems like IBM's Information Management System (IMS), and thereafter network database systems like the Raima Database Manager (RDM). Then came Dr. Codd's normal forms and the relational model. Small-scale relational databases (mostly single-user initially) like dBase, Access and FoxPro started gaining popularity.

With System R from IBM (whose query language, SQL, became the industry standard and from which IBM DB2 was created) and the ACID-compliant (Atomicity, Consistency, Isolation, Durability) Ingres database being released, commercialization of multi-user RDBMS became a reality, with Oracle and Sybase (now acquired by SAP) databases coming into use in the following years. Microsoft had licensed Sybase on OS/2 as SQL Server and later split with Sybase to continue on the Windows OS platform. The open source movement meanwhile continued, with PostgreSQL (an object-relational DBMS) and MySQL (now acquired by Oracle) being released around the mid-1990s. For over two decades, RDBMS and SQL grew to become the standard for enterprises to store and manage their data.

From the 1980s, Data Warehousing systems started to evolve, storing historical information to separate the overhead of Reporting and MIS from OLTP systems. With Bill Inmon's Corporate Information Factory (CIF) model and later Ralph Kimball's OLAP-friendly dimensional model (denormalized star and snowflake schemas) gaining popularity, metadata-driven ETL and Business Intelligence tools started gaining traction, while database vendors promoted the then lesser-used ELT approach and other in-database capabilities, such as the in-database data mining released in Oracle 10g. For DWBI products and solutions, storing and managing metadata in the most efficient manner proved to be the differentiator. Data modeling tools started to gain importance beyond desktop and web application development, and Business Rules Management technologies like ILOG JRules, FICO Blaze Advisor and Pega started to integrate with DWBI applications.
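Kimball's star schema can be shown in miniature. The sketch below uses Python's built-in sqlite3 purely for illustration (the table and column names are invented): one central fact table of measures references denormalized dimension tables, and the typical OLAP query aggregates the facts while slicing by dimension attributes:

```python
# A miniature Kimball-style star schema, using sqlite3 for illustration.
# fact_sales holds the measures; dim_date and dim_product are denormalized
# dimensions joined in via surrogate keys.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INT, month INT);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
    CREATE TABLE fact_sales  (
        date_key    INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        units       INTEGER,
        revenue     REAL
    );
""")
conn.executemany("INSERT INTO dim_date VALUES (?,?,?)",
                 [(20151101, 2015, 11), (20151201, 2015, 12)])
conn.executemany("INSERT INTO dim_product VALUES (?,?,?)",
                 [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
conn.executemany("INSERT INTO fact_sales VALUES (?,?,?,?)",
                 [(20151101, 1, 10, 100.0), (20151201, 1, 5, 50.0),
                  (20151201, 2, 2, 40.0)])

# The typical OLAP query: aggregate facts, slice by dimension attributes.
rows = conn.execute("""
    SELECT d.month, p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_date d    ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY d.month, p.category
    ORDER BY d.month
""").fetchall()
print(rows)   # → [(11, 'Hardware', 100.0), (12, 'Hardware', 90.0)]
```

The denormalization is deliberate: joins fan out from one fact table to a handful of wide dimensions, which is what makes the schema easy for OLAP tools and business users to navigate.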


Once Data Warehouses started maturing, the need for Data Quality initiatives began to rise. Most Data Warehousing development cycles used only a subset of the production data (at times obfuscated or masked), so even when the implementation approach included data cleansing and standardization, core DQ issues would emerge after the production release, at times rendering the warehouse unusable until they were resolved.
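The cleansing and standardization step a DQ initiative adds can be sketched as follows. The rules and field names here are invented for illustration: trim and normalise values, map known variants to a canonical form, and drop the duplicates that only become visible once standardization is applied, exactly the defects that surface when full production data arrives:

```python
# Sketch of a data-quality cleansing step (rules and fields are invented).
# Standardize each record, then de-duplicate on the standardized key.
def standardize(record):
    country_map = {"uk": "United Kingdom", "u.k.": "United Kingdom",
                   "united kingdom": "United Kingdom", "usa": "United States"}
    name = " ".join(record["name"].split()).title()   # collapse whitespace, fix case
    country = country_map.get(record["country"].strip().lower(),
                              record["country"].strip())
    return {"name": name, "country": country}

def cleanse(records):
    seen, out = set(), []
    for rec in map(standardize, records):
        key = (rec["name"], rec["country"])
        if key not in seen:               # duplicates only match *after* standardizing
            seen.add(key)
            out.append(rec)
    return out

dirty = [{"name": "  acme  corp ", "country": "UK"},
         {"name": "Acme Corp", "country": "united kingdom"},
         {"name": "Globex", "country": "USA"}]
print(cleanse(dirty))
```

Note that the first two records only collapse into one after standardization; run against a masked development subset, neither the variant spellings nor the duplicate might ever appear, which is precisely why such issues surface post-release.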

Multi-domain Master Data Management (both operational and analytical) and Data Governance projects started to grow in demand once organizations began to view data as an enterprise asset, enabling a single version of the truth to increase business efficiency and to support both internal and, at times, external data monetization. OLAP integrated with BI to provide ad-hoc reporting, besides being popular for what-if modeling and analysis in EPM / CPM implementations (Cognos TM1, Hyperion Essbase, etc.).

Analytics was primarily implemented by practitioners using SAS (1976) and SPSS (1968) for descriptive and predictive analytics in production environments, and ILOG CPLEX (1987) and ARENA (2000) for prescriptive modeling, including optimization and simulation. While SAS had programming components in Base SAS, SAS/STAT and SAS/GRAPH, the strategy evolved to move SAS towards a UI-based modeling platform with the launch of Enterprise Miner and Enterprise Guide, products similar to SPSS Statistics and Clementine (later IBM PASW Modeler). These were essentially UI-based, drag-drop-configure analytics model development tools for practitioners with backgrounds in Mathematics, Statistics, Economics, Operations Research, Marketing Research or Business Management. Models used representative sample data and a reduced set of factors / attributes, and hence performance was not an issue until then.

Around the middle of the last decade, anyone with knowledge and experience of Oracle, ERwin, Informatica and MicroStrategy, or competing technologies, could play the role of a DWBI Technology Lead, or even an Information Architect given additional exposure to designing non-functional DW requirements, including scalability, best practices, security, etc.

Soon, enterprise data warehouses, now needing to store years of data, often without an archival strategy, started to grow exponentially in size. Even with optimized databases and queries, performance dropped. Then came appliances, or balanced / optimized data warehouses: optimized database software often coupled with the operating system and custom hardware. Most appliances supported only vertical scaling, but the benefits they brought were rapid accessibility, rapid deployment, high availability, fault tolerance and security.
Appliances thus became the next big thing, with agile data warehouse migration projects being undertaken to move from RDBMS like Oracle, DB2 and SQL Server to query-optimized DW appliances like Teradata, Netezza, Greenplum, etc., incorporating capabilities like data compression and massively parallel processing (shared-nothing architecture), among other features. HP Vertica, which took the appliance route initially, later reverted to being a software-only solution.

Initially, parallel processing had three basic architectures: MPP, SMP and NUMA. MPP stands for Massively Parallel Processing and is the most commonly implemented architecture for query-intensive systems. SMP stands for Symmetric Multiprocessing and has a shared-everything (including shared-disk) architecture, while NUMA stands for Non-Uniform Memory Access, essentially a combination of SMP and MPP. Over time, the architectural definitions became more amorphous as products kept improving their offerings.
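The shared-nothing idea behind MPP can be illustrated in miniature with Python's multiprocessing module (a toy analogy, of course; real appliances distribute partitions across nodes and disks, not OS processes). Each worker owns its own partition of the data and aggregates it independently; only the small partial results are combined at the end:

```python
# Toy illustration of MPP "shared nothing" parallelism using multiprocessing.
# Each worker owns a disjoint partition and computes a partial aggregate;
# only the tiny partial results cross process boundaries.
from multiprocessing import Pool

def partial_sum(partition):
    """Each 'node' aggregates only the rows it owns - no shared state."""
    return sum(partition)

def mpp_sum(rows, workers=4):
    # Partition the rows across workers, as an appliance distributes rows
    # across nodes, then run the partial aggregates in parallel.
    partitions = [rows[i::workers] for i in range(workers)]
    with Pool(workers) as pool:
        partials = pool.map(partial_sum, partitions)
    return sum(partials)   # the cheap final merge step

if __name__ == "__main__":
    print(mpp_sum(list(range(1_000_000))))   # same answer as a serial sum
```

The key property is that the expensive scan-and-aggregate work never touches shared memory or shared disk, which is why shared-nothing designs scale out by simply adding nodes.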

While industry and cross-industry packaged DWBI and Analytics solutions increasingly became a product and SI / solution partner strategy, the end of the last decade saw growing adoption of open source ETL, BI and Analytics technologies like Talend, Pentaho and the R library (with the exceptions of the Pharma & Life Sciences and BFSI sectors), in organizations where essential features and functionality were sufficient to justify the ROI on DWBI initiatives, which were usually undertaken for strategic requirements rather than for day-to-day operational intelligence or insight-driven new revenue generation.

Adoption of cloud-based platforms and solutions, and even DWBI and Analytics application development on private or public cloud platforms like Amazon and Azure (IBM has now come up with Bluemix and dashDB as alternatives), also started to grow, whether as part of a start-up strategy or a cost optimization initiative in small and medium businesses, and even in some large enterprises as an exploratory initiative, given confidence in data security.

Visualization software also started to emerge and carve a niche, growing in relevance mostly as a complementary solution to IT-dependent enterprise reporting platforms. Visualization products were business-driven, unlike technology-forward enterprise BI platforms, which could also provide self-service, mobile dashboards, write-back, collaboration, etc., but had multiple components with complex integration and, at times, complex pricing.

Hence, while traditional enterprise BI platforms had a data-driven "bottom up" product strategy, with dependence and control resting with the IT team, visualization software took a business-driven "top down" product strategy, empowering business users to analyze data and create their own dashboards with minimal or no support from the IT department.

With capabilities like geospatial visualization, in-memory analytics and data blending, visualization software like Tableau is growing steadily in acceptance. Others have blended visualization with out-of-the-box analytics, like TIBCO Spotfire and, in recent years, SAS Visual Analytics, a capability otherwise achieved in visualization tools mostly by integrating with R.

All of the above was manageable with reasonable flexibility and continuity as long as data was more or less structured, ECM tools took care of documents, and EAI technologies handled mostly real-time integration and complex event processing between applications / transactional systems.

But a few years ago, digital platforms, including social, mobile and others like IoT / M2M, started to grow in relevance, and Big Data Analytics grew beyond experimental POCs, first complementing the enterprise data warehouse (along with enterprise search capabilities) and at times even replacing it. The data explosion gave rise to the '3 Vs' dilemma of volume, velocity and variety, and data was now available in all possible forms and in newer formats like JSON and BSON, which had to be stored and transformed in real time.

Analytics now had to be performed over millions of records in motion, unlike traditional end-of-day analytics over data at rest. Business Intelligence, including reporting, monitoring, alerts and even visualization, had to become real-time. Even the consumption of analytics models now needed to be real-time, as in the case of customer recommendations and personalization, leveraging the smallest windows of opportunity to up-sell / cross-sell to customers.
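The contrast between data at rest and data in motion can be sketched with a toy windowed aggregator (the event shape and window size below are invented for illustration). Instead of one end-of-day batch total, it emits a running result each time a time window closes, which is the shape of computation that real-time recommendation triggers need:

```python
# Toy tumbling-window aggregator over a stream of (timestamp, item) events.
# Results are emitted as each window closes, not in one end-of-day batch.
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Yield (window_start, per-item counts) as each window closes."""
    current_window, counts = None, defaultdict(int)
    for ts, item in events:                      # events arrive in time order
        window = ts - (ts % window_seconds)
        if current_window is not None and window != current_window:
            yield current_window, dict(counts)   # close the finished window
            counts = defaultdict(int)
        current_window = window
        counts[item] += 1
    if current_window is not None:
        yield current_window, dict(counts)       # flush the final window

stream = [(0, "view"), (10, "click"), (65, "view"), (70, "view"), (130, "buy")]
print(list(tumbling_window_counts(stream)))
# → [(0, {'view': 1, 'click': 1}), (60, {'view': 2}), (120, {'buy': 1})]
```

Production stream processors add what this sketch omits, out-of-order events, sliding rather than tumbling windows, and fault-tolerant state, but the core inversion is the same: the computation comes to the data as it flows, rather than the data coming to rest before the computation.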

It is Artificial Intelligence systems, powered by Big Data, that are becoming the game changer of the near future, and it is Google, IBM and others like Honda who are leading the way in this direction.
To be continued...