Archive for the ‘Data Analytics’ Category

CoolChip Technology Benefits from ZENeSYS Competitive Landscape Analysis Methodology

September 23, 2011

On August 20th and 21st, 32 future management consultants from top business schools in India participated in a ZENeSYS Boot Camp. The objective of the boot camp was to learn the ZENeSYS Competitive Landscape Methodology and write the best analysis for CoolChip Technologies.

Winner of an MIT clean energy award and recipient of a DARPA grant, CoolChip Technologies has perfected its patented technology for rapid heat removal from high-performance computer chips. The company is now ready for its next level of growth.

The participants in the Boot Camp were tasked with using the ZENeSYS Competitive Landscape Analysis (CLA) methodology to provide an unbiased assessment of CoolChip’s product features as compared to its competitors. Three teams from the Boot Camp submitted their analyses in the final round, prompting the following response from William Sanchez, CEO of CoolChip Technologies.

“We had the opportunity to benefit from their novel methodology for competitive landscaping. The ZENeSYS analysis will be very useful for CoolChip’s next stage of growth as we seek our next round of financing. In-depth understanding of the competitive landscape is crucial for positioning our company. The ZENeSYS systematic framework provides a clear, concise multi-dimensional presentation of the competitive landscape. The tool is a priceless piece of machinery for processing unwieldy amounts of information. The rapid report generation and high standard of quality make the market intelligence very valuable and a resource CoolChip will continue to use as we explore new market verticals and require comprehensive analysis on incumbents and new entrants.”

About ZENeSYS Competitive Landscape Analysis (CLA): A methodology that has been perfected over two years to give the most up-to-date snapshot of market conditions as they relate to a client’s specific product or service. The methodology uses a framework based on competitor goals and customer preferences to synthesize news, blogs, forums, white papers, patents, press releases, testimonials, product literature, industry reports, books, and journals, and to identify best practices and opportunity areas. Startups develop insights for growth, and investors get an unbiased assessment of investment risk.


The Great Liquidity Crunch – Private Equity Firms Re-invent Themselves

July 31, 2011

By Vaibhav Gupta – ZENeSYS Certified Consultant, IIM Indore (2010-12 Batch). Vaibhav completed his B.Tech. in Mechanical Engineering from Delhi College of Engineering. Before his MBA, Vaibhav worked in Project Management for 4 years with Siemens Energy. Vaibhav intends to join the Management Consulting industry after his MBA.

Abstract: The liquidity crunch, tighter regulations and more control demanded by limited partners are forcing the PE industry to re-invent itself.

The three key areas of transformation are:

  1.   More secondary deals – establishing the best valuations and finding the right buyers
  2.   Restructuring deals – creative ways of re-negotiating (past) unrealistic deals
  3.   Employing turnaround specialists – creating consultancy-like capability for operational improvements

Over the last decade, contrary to what one might think, the private equity industry has gone through an economic cycle just like the stock market – up and down. The up phase of the PE market started around 2003 and continued to boom until 2007-2008, just as the financial crisis kicked in.

During the up phase, which started after the telecom bust in 2002, the PE industry saw the largest leveraged buyouts (LBOs) in history. In 2006 alone, PE firms bought more than 650 US companies for a total of $375 Billion – around 18 times the value of transactions closed in 2003 [ref 1].

The LBO movement was fueled by decreasing interest rates, low lending standards and the tight regulatory framework for owning a public enterprise (the Sarbanes-Oxley Act) [ref 2]. This is also one of the reasons why venture capitalists started relying on sales to strategic buyers for an exit.

The arrival of the financial crisis in 2008 and the accompanying liquidity crunch brought the LBO frenzy to a quick halt. The leveraged finance market almost stood still during and after the crisis, and many deals were withdrawn or had to be re-negotiated [ref 3].

The PE industry is measured with the help of two metrics – fund raising and investment activity. Fund raising refers to the money investors have committed to PE funds in a year. Fund raising fell to $150 Billion globally in 2009 from $450 Billion in 2008 – the lowest figure since 2004. The lack of debt in the following year, 2010, meant no hope of a speedy recovery.

The other metric, investment activity, which represents the financing of businesses, fell from $181 Billion globally in 2008 to just over $90 Billion in 2009. In 2010, it picked up to $110 Billion [ref 5]. This minor rebound can possibly be attributed to increased investment in small and medium enterprises and in emerging markets such as Brazil, China and India.

Since PE funds acquire firms in order to sell them at a profit later, life becomes difficult for them during an economic slowdown – all the more so when a liquidity crunch is added to the mix. This was apparent from the total value of PE exit transactions, which fell from $151 Billion in 2008 to $81 Billion in 2009.

Over the next five years, until 2016, over $800 Billion in loans extended on committed deals will become due or will have to be refinanced [ref 6]. This $800 Billion is distributed almost equally between bank loans and high-yield bonds. To further complicate the hiving off of these assets, the US government has passed a bill that requires any PE firm with more than $150 Million in assets under management to register with the SEC.

The implications are public disclosure of risks, business activities, names of the personnel involved, assets owned, amounts owed to each creditor, performance metrics, debt and defaults [ref 7].

According to estimates [ref 9], there is one trillion dollars’ worth of dry powder in PE funds globally. However, limited partners (LPs) are now demanding more control and requesting more information about their investments. The LPs want to keep track of drawn-down capital so that general partners (GPs) don’t overdraw their limits [ref 8]. This has, in effect, created a liquidity crunch of sorts within the PE community itself.

This puts PE funds in a tight spot: how should they service their existing debts and acquire new assets? To service debts, they can look at options such as secondary markets, restructuring deals, or employing turn-around specialists to improve valuations. To acquire new assets, they must look harder in the marketplace or find greener pastures in emerging markets.

Hence, turn-around specialists are in demand. PE firms are turning away from traditional leveraged deals and looking to invest in distressed companies. They feel it is better to restructure deals based on a change in strategy than to take money out and pursue the matter in court.

Turn-arounds are becoming more important as top lines (revenues) shrink and even vendors stop extending favorable credit terms. The success of a PE acquisition now comes down to operational excellence: improved management, optimized expenditures, and rooting out inefficiencies such as the overcapacity created in high-growth years.

Interestingly, PE firms have identified a new gap in the market: companies that do not have the means to hire expensive management consultants now find this a welcome opportunity to bring aboard high-quality leadership.

Emerging markets such as Brazil, China and India remain attractive, and PE firms need to find ways of entering them. China and India were the two fastest-growing economies even during the recession, and both need to develop infrastructure to support that high growth. Brazil is set to host the 2014 World Cup and the 2016 Olympics.

With increasing pressure from regulators, a lack of liquidity, and tighter control demanded by limited partners, future PE deals will need to be financed and executed with better insights and strategic planning. This means deeper research before deals are struck, awareness of best practices in target markets for operational excellence, and market intelligence for finding the right buyers quickly and at the best price.

This article has been written as part of a research project undertaken by one of our certified consultants. All information and content has been derived from secondary research and insights gained from recent projects at ZENeSYS for the PE industry. Credits have been provided in the references for text and data. ZENeSYS provides Market Research and Market Intelligence services to the Private Equity industry on a regular basis.


References:
  5. Wharton PE Report 2010
  7. “A Closer Look at the Dodd-Frank Wall Street Reform and Consumer Protection Act – Impact on Advisers to PE Funds”, a report by PwC, May 2011

Tips for Quality in Data Analytics Projects

July 18, 2011


If you are thinking of launching a data analytics project at the departmental level, the following best practices for gathering, preparing and loading data into your analytics system will be helpful. This white paper was written by ZENeSYS for iQCodex, the maker of the leading data collection and transformation tool, iQCodex. Visit for more information.

What is Data Analytics?
Companies are eager to harness the sea of information around them for deep insights. For example, with the recent standardization of medical health records, health care providers now have access to historical data on chronic illness from a variety of sources, which can be helpful for better diagnosis. The Internet, business databases, news feeds, partners, clients and internal company data available online mean access to market conditions in near real time. Because of this, companies feel pressure to process information for operational efficiency and competitive advantage using Data Analytics – the science of analyzing data to develop insights.

Data Quality is Key
A significant part of any data analytics task is data collection, scrubbing and validation, better known as ETL (extraction, transformation and load). According to some studies, between 40% and 70% of the cost of a data analytics project ends up being spent on extraction, transformation and loading.

Business Units Take on Data Analytics Projects
With the arrival of affordable and easy-to-use ETL tools, business units rather than IT departments are taking on data analytics initiatives. This is understandable because both the input and the output of a data analytics initiative belong to the business units: the input is raw data, which only the business units know where to find, and the output is insights, which again only the business units are capable of defining. If your business unit is considering a data analytics project, it is important to understand the best practices involved in capturing, transforming, validating and updating data for analysis.

Best Practices for Ensuring Data Quality
Extraction, Transformation and Loading (ETL) in simple terms means the job of (1) collecting or capturing data, (2) ensuring accuracy, (3) converting it for relevance and homogeneity, and (4) updating it to keep it fresh.

We will explain each of these in more detail now.

Collection of Data: Data will inevitably come from a variety of sources, and we can rest assured that the format will vary from one source to another. If it comes from a structured source such as a database, then it will be accessible through a database language such as SQL. This is good news, as structured data is the easiest to deal with.

The bad news is that structured data from external sources, such as medical records or financial transactions, comes with a twist: it is likely to arrive in XML file formats – industry-specific data exchange languages. These XML standards tend to vary by industry, e.g. insurance, finance or health.

Non-structured data can come from text, PDF, Word, or Excel files, though in most cases it will be Excel or comma-separated (CSV) files. This type of data generally comes from customers, news, industry reports, and internal project documents. Tapping into this category of data is often underestimated: by some estimates, 40% to 70% of data in analytics projects comes from MS Excel, MS Access or some other desktop or legacy application.
The important thing to note here is that the ETL tool you use must be able to handle all your required input formats with equal ease.

Accuracy of Data: The foremost step in ensuring accuracy is avoiding manual data entry. This means an ETL tool should be able to “read” all data sources electronically. For this to happen, the ETL interface should handle all the formats described in the previous section, i.e. SQL, XML, Excel, Access, and plain-text CSV files.
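As a minimal sketch of what “reading sources electronically” can look like, here is a toy Python dispatcher that routes a file to the right parser by extension. The format set is trimmed to CSV and JSON for brevity, and the file name is an invented example; a real ETL tool would cover the full list above.

```python
import csv
import io
import json

def read_rows(name, text):
    """Dispatch on file extension and return records as a list of dicts."""
    if name.endswith(".csv"):
        return list(csv.DictReader(io.StringIO(text)))
    if name.endswith(".json"):
        return json.loads(text)  # assumes a JSON array of records
    raise ValueError("unsupported format: " + name)

rows = read_rows("claims.csv", "id,amount\n1,250\n2,900\n")
```

The point of the dispatcher is that downstream steps see one uniform record shape regardless of where the data came from.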

Electronic reading not only assures accuracy; it also reduces cost, speeds up data collection and, most importantly, safeguards against errors and fraud – something that is becoming increasingly important as regulators tighten the laws.
A good ETL tool will also provide a means of cross-checking data for errors. For example, there should be validation checks for date formats, phone numbers, and postcodes. Some ETL tools provide look-up tables and programming capability to cross-check data accuracy.

For example, an insurance claim record may contain a claim date for an incident that occurred prior to the coverage date. Here a date format check alone will not be enough; we need a way to “validate” the incident date by fetching the policy validity dates before the claim record is allowed to go through.
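A sketch of that cross-check in Python (the dates are invented for illustration): the record passes a format check yet is still rejected because the incident falls outside the coverage window.

```python
from datetime import date

def claim_is_valid(incident_date, policy_start, policy_end):
    """Beyond a format check: the incident must fall inside the coverage window."""
    return policy_start <= incident_date <= policy_end

# Incident occurred before coverage began, so the record is rejected.
rejected = not claim_is_valid(date(2011, 1, 5), date(2011, 2, 1), date(2012, 1, 31))
```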

Data Conversion: Conversion of data is probably the most complex feature of an ETL tool. We will go through several aspects of data conversion requirements. Sometimes data is not in the format we need; a simple example would be the need to convert currency. ETL tools should have the means to convert to the desired alternative units on the fly.
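The currency case can be sketched in a few lines. The rate table below is a hypothetical static snapshot; a production tool would fetch live rates as data is captured.

```python
# Hypothetical static rate table; a production tool would fetch live rates.
RATES_TO_USD = {"USD": 1.0, "EUR": 1.35, "INR": 0.022}

def to_usd(amount, currency):
    """Convert an amount in `currency` to US dollars, rounded to cents."""
    return round(amount * RATES_TO_USD[currency], 2)
```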

The second type of conversion requirement is semantic normalization. Consider a role that is referred to in multiple ways, e.g. a student, an apprentice, a trainee, or an observer. If they all mean the same thing for your analysis, they have to be normalized into a single preferred label before entering your analytics engine.

A third type of conversion is a bit more complex than the currency conversion example. What if we need only “net of tax” sales figures when only gross sales figures are available at the source? Your ETL tool should be able to compute net sales from gross sales numbers by deducting the prevailing tax rates for that region as the source data is captured. This is the capability of allowing a “computed value” as input.
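A computed-value step might look like the sketch below. The tax table is invented, and it assumes the gross figure is tax-inclusive (so net = gross divided by one plus the rate); if gross excluded tax the formula would differ.

```python
# Hypothetical regional tax rates; assumes gross figures are tax-inclusive.
TAX_RATE = {"CA": 0.0725, "TX": 0.0625}

def net_sales(gross, region):
    """Derive net-of-tax sales from a tax-inclusive gross figure at capture time."""
    return round(gross / (1 + TAX_RATE[region]), 2)
```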

A fourth type of data conversion is “protocol conversion”. With the advent of standardization, interchange of data between organizations is now facilitated by communications standards (also called protocols) such as SWIFT in financial services, HL7 for health records, and ACORD in insurance. A good ETL tool must provide the necessary “hooks” to translate these standards and render the data into a format ready for analysis in your analytics engine.

The fifth conversion challenge is handling ad-hoc formats. Even today, many organizations have “system islands”, meaning different systems running different computer applications with no connection between them. The challenge is that a data analytics project may need to tap into data from one or many disjointed systems, such as human resources, accounts payable, inventory management, or customer relationship management. Some of these systems may be old, with proprietary data formats. In such cases an ETL tool must at least be able to read comma-separated files – something even older systems are capable of generating. However, conversion from such file formats will rarely go entirely according to plan.

When an error is encountered, an ETL tool needs to “flag” an exception. An administrator of the ETL tool can then examine the failed conversion step and make the conversion manually. Sophisticated ETL tools will “learn” these exception conditions so that the same type of encounter does not result in another manual intervention request.

Duplicate data is a very common exception condition. Here the same piece of data may appear more than once, quite likely with conflicting parts. For example, an accounts payable system may show a customer’s bank account number as XYZ while a purchase order by the same customer shows it as ABC. In such cases the ETL tool must de-duplicate or, in case of a conflict, establish the correct bank account number.
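One simple way to resolve such conflicts is to rank the sources by trustworthiness and keep the record from the most authoritative one. The sketch below reuses the bank-account example; the source names and ranking are illustrative assumptions.

```python
def deduplicate(records, key, authority):
    """Keep one record per key; on conflict, trust the higher-ranked source
    (lower rank number = more authoritative)."""
    best = {}
    for rec in records:
        k = rec[key]
        if k not in best or authority[rec["source"]] < authority[best[k]["source"]]:
            best[k] = rec
    return list(best.values())

records = [
    {"customer": "C1", "account": "XYZ", "source": "accounts_payable"},
    {"customer": "C1", "account": "ABC", "source": "purchase_order"},
]
clean = deduplicate(records, "customer",
                    {"accounts_payable": 0, "purchase_order": 1})
```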

Stripping out non-relevant data is another commonly encountered exception condition. This can and will happen when data comes from legacy systems or an external organization. Take, for example, a data analytics exercise on eye-related problems in diabetic patients. The data sources on patient history will likely contain many data points apart from eye history. The ETL tool in this case has to systematically filter in only eye-related data from diabetic patient records in the incoming source.
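A crude but serviceable filter for that example might match diagnosis text against a keyword list. The term list and record shape below are invented for illustration; a real system would use coded diagnoses (e.g. ICD codes) rather than keywords.

```python
# Hypothetical keyword filter: keep only eye-related rows for diabetic patients.
EYE_TERMS = {"retinopathy", "glaucoma", "cataract", "macular"}

def relevant(record):
    text = record["diagnosis"].lower()
    return record["diabetic"] and any(term in text for term in EYE_TERMS)

history = [
    {"diagnosis": "Diabetic retinopathy", "diabetic": True},
    {"diagnosis": "Fractured wrist", "diabetic": True},
    {"diagnosis": "Cataract", "diabetic": False},
]
eye_only = [r for r in history if relevant(r)]
```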

Updating of Data: Data analytics is not a one-time effort. It is an ongoing task that requires refreshing data on a continuous basis. However, to update data, an ETL tool needs to do more than just fetch new data.

Synchronizing the sequence of data updates: An address change, for instance, needs to be applied before new eligibility rates by region are applied to a particular customer record. Each update action should have a way to take into account its dependency on another update or the planned time of an update.
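Dependency-aware sequencing of updates is essentially a topological ordering problem. The sketch below orders update steps so prerequisites run first; the step names mirror the address-before-rates example and are illustrative only.

```python
def order_updates(updates, depends_on):
    """Return update names in an order where dependencies come first."""
    ordered, seen = [], set()
    def visit(u):
        if u in seen:
            return
        seen.add(u)
        for dep in depends_on.get(u, []):
            visit(dep)
        ordered.append(u)
    for u in updates:
        visit(u)
    return ordered

plan = order_updates(
    ["apply_regional_rates", "change_address"],
    {"apply_regional_rates": ["change_address"]},
)
```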

Timing of data updates: Even though certain data may be available immediately, the actual update may need to occur at a specific time to ensure no loss of data integrity. This often happens when dealing with legacy or older systems where processing takes place at night and the required data can only be extracted after 5 am. Here the ETL tool needs to ensure that updates dependent on that nightly batch start only after 5 am.

Partial or selective updates: Ad hoc sources of data, such as external systems and personal data files, are often not capable of providing “selective updates”, meaning they will contain the full set of data records each time. So if you are importing the credit history of your customers from an external source, each time you want an update you will have to deal with the entire data set – even though you are only interested in the records that have changed since your last load. Sometimes there will be a flag or indicator marking which items have changed since the last batch, but most likely there will not. If change flags exist, it is up to the ETL system to detect them and update only the changed records; if not, the ETL tool needs to determine on its own which records have changed. Doing a full load for each update is not a good idea because it can slow down your data analytics server significantly. Another drawback is that you may end up with data that you no longer need.
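When no change flags exist, one common technique is to fingerprint each record and compare fingerprints between loads. The sketch below hashes a canonical JSON form of each record; the credit-score records are made-up examples.

```python
import hashlib
import json

def fingerprint(record):
    """Stable hash of a record's content, used to spot changes between loads."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

def changed_records(previous, current, key="id"):
    """Return only the current records that are new or differ from the last load."""
    seen = {r[key]: fingerprint(r) for r in previous}
    return [r for r in current if seen.get(r[key]) != fingerprint(r)]

old = [{"id": 1, "score": 700}, {"id": 2, "score": 650}]
new = [{"id": 1, "score": 700}, {"id": 2, "score": 640}, {"id": 3, "score": 720}]
delta = changed_records(old, new)
```

Only the delta then needs to be pushed to the analytics server, avoiding the full-load slowdown described above.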

Changes in transformation logic: The entire ETL process is, in essence, a series of algorithms and business rules that enable the capture, transformation and loading of data. Over time, these algorithms will need to be modified because of changes at the source, discovery of a better way of doing things, or a change in policy. An ETL tool should allow an administrator to do this easily: to modify a formula or rule, the administrator should first be able to see how it is currently set up and then be given the ability to edit it. This is a very important consideration when selecting an ETL tool. Tools that do not provide this transparency will become unmanageable at some point, as people forget how the logic was set up long ago.

Business process changes: Let’s say today your data analytics system is set up to import credit ratings directly from a rating agency. Tomorrow your accounts department wants to take on the role of approving the ratings because you have access to data from multiple rating agencies. Some ETL systems are workflow-enabled to accommodate such approval processes. iQRoute from iQCodex excels here, with workflow features such as escalating, authorizing, approving, and overriding.

Keeping an audit trail: With the advent of new laws such as Sarbanes-Oxley in the US and Solvency II in the EU, the need for proper audit trails is becoming vital. Regulatory pressure aside, it is good practice to have an audit trail feature so that if things go wrong, the system can pinpoint what was changed, who made the change, and when.
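At its simplest, an audit trail is an append-only log written alongside every change, recording the who/what/when described above. The record shape and user name below are illustrative assumptions.

```python
from datetime import datetime, timezone

audit_log = []

def audited_update(record, field, new_value, user):
    """Apply a change and append who/what/when to the audit trail."""
    audit_log.append({
        "when": datetime.now(timezone.utc).isoformat(),
        "who": user,
        "field": field,
        "old": record.get(field),
        "new": new_value,
    })
    record[field] = new_value

rec = {"rating": "AA"}
audited_update(rec, "rating", "A", user="analyst1")
```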

Ability to roll back changes: Often things go wrong and mistakes are discovered only after days or weeks. In such instances the audit trail will indicate the source of the problem, but to get back on our feet, a rollback to the former condition becomes a huge lifesaver. A good ETL system will provide date- and time-stamped snapshots of changes made. An administrator can then review each historical change and decide how far back to go to start over.
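The snapshot-and-rollback idea can be sketched with deep copies keyed by a timestamp label (labels and data below are invented; a real system would snapshot to durable storage, not memory).

```python
import copy

class SnapshotStore:
    """Keep timestamped deep copies so an administrator can roll back."""
    def __init__(self):
        self.snapshots = []

    def save(self, label, data):
        self.snapshots.append((label, copy.deepcopy(data)))

    def rollback(self, label):
        """Return a copy of the most recent snapshot with this label."""
        for tag, data in reversed(self.snapshots):
            if tag == label:
                return copy.deepcopy(data)
        raise KeyError(label)

store = SnapshotStore()
data = {"accounts": [1, 2]}
store.save("2011-07-01", data)
data["accounts"].append(3)          # a later, mistaken change
restored = store.rollback("2011-07-01")
```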

Several organizations are launching data analytics initiatives at the departmental level to gain invaluable market and operational insights. However, data analytics outcomes depend on the quality of the data. Several affordable tools are available on the market to provide ETL (extract, transform and load) capability. Before you launch a data analytics project, consult this checklist to ensure you select an ETL tool with all of these features. They are essential for data quality and therefore for the validity of the insights derived.

Our own Competitive Analysis

March 8, 2011

If you are in need of custom market research, competitive analysis, or data analytics, here are your options:

  1. ZENeSYS – A virtual, scalable network of certified management consulting resources
  2. Guru/Elance – A host of other portals for engaging freelance consultants
  3. Big Firms – The big name firms like Gartner, IDC, Yankee, Forrester, & Big 4
  4. Client – You could do it yourself

So how do we stack up?

While we are relatively unknown, our unique model of using a network of our own certified consultants has two powerful advantages:

  1. Virtual Consultants meaning lower cost, faster engagements, diverse expertise
  2. Assurance of quality from ZENeSYS core team

Our consulting certification process is the result of developing unique courseware and an assessment methodology over three years. The result: all our past clients are willing to act as our references.

A unique strategy by a Digital Media PE fund

February 12, 2011

Buyout PE funds are at an all-time high since 2005. Despite this, the amount of “dry powder”, or unallocated committed capital, in buyout funds has been above 50% for the last three years, according to AARMCORP, who track and benchmark PE funds. Too much money is chasing too few deals.

Furthermore, a peer-set comparison of buyout funds shows that funds under $500M in assets investing in technology and communications buyouts have reported less than 10% IRR over the last three years. Despite these two negative indicators, there is unabated interest in technology buyout funds due to the promise of high returns in a relatively short term.

The good news is that it can still happen – provided there is a capable management team with the right connections, a smart deal identification mechanism, a creative makeover process, and secured exit planning. Enter Michael Connolly and David Silver of the Atlas Digital Media Opportunity Fund, who have what it takes to deliver the heady 40% IRR limited partners expect.

Their highly successful deals – EldercareLink, BuyerZone, Big City Doctors, Med Trak Alert and Atehena East – have given them a rich ecosystem of deal selection expertise. They have already identified two Internet properties with interested buyers as suitable target firms. Zane Tarrence, an adviser to the Fund, maintains a large database of potential buyers, and the Managing Partners have relationships with both strategic and financial buyers – for example, QuinStreet, Internet Brands, The Health Central Network, and BankRate, among many others. In all cases their criterion is to seek out fragmented private companies at a discount and exit at a multiple much higher than what other private or financial buyers would pay.

Unlike other funds, Atlas Digital examines potential investments with an emphasis on who the likely buyers of the portfolio company will be before the Fund acquires a controlling interest, so there are no delays in cashing out in a timely manner. The partners’ management consulting backgrounds have been instrumental in creating valuable “reusable know-how” from restructuring Internet property deals such as EldercareLink and Big City Doctors.

Applying tried-and-tested methodologies to fix Internet properties gives them an edge over others who fail to derive efficiencies in their asset management operations. This is a significant edge for Atlas over other funds of similar size: not only are they able to make more deals, they can also manage them at a lower operating cost.

There are hundreds of Internet businesses valued just below the transaction-size threshold of what strategic buyers are looking for, leaving an opportunity for Atlas Digital to identify, acquire, restructure and exit by leveraging their connections, methodologies and pre-appointed exit deals. Email for more information.

Both Atlas Digital Partners and AARMCORP are past clients of ZENeSYS.

Custom secondary research is affordable and better than primary research

January 26, 2011

According to Wikipedia, as of March 2009 the indexable web contained at least 25.21 billion pages. Google claims there are one trillion unique URLs.

Can you imagine the number of information nuggets in web pages, blog posts, news articles, documents, discussions and social network chatter out there? Secondary research can tap into that gold mine of information and make businesses rethink the idea that “reliable market research only comes from primary research”.

Here are the drawbacks of Primary Research

  1. Sample size is never big enough
  2. Cannot guarantee that respondents are giving an honest answer
  3. Respondents may not be qualified or have the right exposure to provide answers

Against those same drawbacks, here are the benefits of secondary research on the net

  1. Sample size is practically limitless
  2. Answers can be cross-verified, e.g. the number of homes with two-car garages in a suburban area is a more accurate indicator of mean income than asking 100 respondents from that area about their household income
  3. When Apple wanted to determine the market size for the iPhone, I don’t suppose they asked phone users whether they would buy one.

Economizing Data Analytics – Private Equity Portfolio Optimization Case Study

January 24, 2011

This was sent out in a recent ZENeSYS newsletter…

Until now, sophisticated analysis of data was the privilege of large institutional fund managers. Now technology and the Internet have leveled the playing field. Here is a case study of how this is taking place.

ZENeSYS has teamed up with AARMCORP to develop a platform that investors in Private Equity funds can use. Our first solution will be an implementation of a Mean Variance Optimization algorithm on a PE fund portfolio.

Balancing Act

This algorithm will allow investors to test their portfolios for maximum return. By making a peer group comparison of each new potential investment, the algorithm will make a call on whether to go ahead with that investment or not.

The determination will be made on whether the new fund is in the top percentile of its peers in terms of historical returns and whether it diversifies the risk in the target portfolio or not.

Fund managers will recognize that this is an implementation of Markowitz’s theory of portfolio optimization. However, finding all the peer groups, making the comparison and testing the diversification is not a trivial task – even assuming that all the comparison data were available to start with!

So how can it be done affordably? The answer lies in a shared platform, accessed over the Internet, that runs this algorithm against AARMCORP’s exhaustive database of PE fund performance data.
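As a toy illustration of the Markowitz idea (the return series below are invented, not AARMCORP data): a candidate fund diversifies a portfolio when mixing it in lowers portfolio variance below that of either fund held alone, thanks to less-than-perfect correlation.

```python
# Two-asset mean-variance sketch. fund_a and fund_b are made-up annual returns.
def mean(xs):
    return sum(xs) / len(xs)

def variance(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def covariance(xs, ys):
    mx, my = mean(xs), mean(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

def portfolio_variance(w1, r1, r2):
    """Variance of a two-fund portfolio with weight w1 in fund 1, 1 - w1 in fund 2."""
    w2 = 1 - w1
    return (w1**2 * variance(r1) + w2**2 * variance(r2)
            + 2 * w1 * w2 * covariance(r1, r2))

fund_a = [0.10, 0.12, 0.08, 0.11]
fund_b = [0.05, 0.15, 0.02, 0.18]  # higher variance, imperfectly correlated
```

The full platform does far more (peer-percentile screening across a fund database), but the diversification test at its core is this computation scaled up.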

The sheer scalability of this solution reduces the cost of running each test to the price of a standard analyst report.

Image credits to Pete Ellis