Author Archive

CoolChip Technology Benefits from ZENeSYS Competitive Landscape Analysis Methodology

September 23, 2011

On August 20th and 21st, 32 future management consultants from top Business Schools in India participated in a ZENeSYS Boot Camp. The objective of the boot camp was to learn the ZENeSYS Competitive Landscape Methodology and write the best analysis for CoolChip Technologies.

Winner of the MIT Clean Energy award and recipient of a DARPA grant, CoolChip Technologies has perfected its patented technology for rapid heat removal from high-performance computer chips. The company is now ready for its next level of growth.

The participants in the Boot Camp were tasked with using the ZENeSYS Competitive Landscape Analysis (CLA) methodology to provide an unbiased assessment of CoolChip’s product features as compared to its competitors. Three teams from the Boot Camp submitted their analyses in the final round, which prompted the following response from William Sanchez, CEO of CoolChip Technologies.

“We had the opportunity to benefit from their novel methodology for competitive landscaping. The ZENeSYS analysis will be very useful for CoolChip’s next stage of growth as we seek our next round of financing. In-depth understanding of the competitive landscape is crucial for positioning our company. The ZENeSYS systematic framework provides a clear, concise, multi-dimensional presentation of the competitive landscape. The tool is a priceless piece of machinery for processing unwieldy amounts of information. The rapid report generation and high standard of quality make the market intelligence very valuable and a resource CoolChip will continue to use as we explore new market verticals and require comprehensive analysis of incumbents and new entrants.”

About ZENeSYS Competitive Landscape Analysis (CLA): A methodology that has been perfected over two years to produce the most up-to-date snapshot of market conditions as they relate to a client’s specific product or service. The methodology uses a framework based on competitor goals and customer preferences to synthesize news, blogs, forums, white papers, patents, press releases, testimonials, product literature, industry reports, books, and journals, identifying best practices and opportunity areas. Startups develop insights for growth, and investors get an unbiased assessment of investment risk.


A Creative Market Entry Strategy for CoolChip Technologies

August 15, 2011

In the first of a series, ZENeSYS is launching a management consulting training boot camp and a competition to go along with it. CoolChip Technologies, the MIT startup and winner of the MIT Clean Energy award, is the client sponsor.

The training is directed at young aspiring management consulting professionals in India who are already at business schools or are looking to make the transition. In a week-long run-up to the first day of the boot camp, participants will learn consulting tools and methodologies available to them online at our website.

On day one of the boot camp, the participants will hit the ground running with a reference market entry case and guidance from instructors at ZENeSYS. On day two, they will develop a market strategy for CoolChip Technologies using our Competitive Landscape Analysis methodology, a powerful and proven technique for quick analysis of opportunities and threats for new products.

The CEO of CoolChip Technologies, Mr. William Sanchez, will be present at the Boot Camp over Skype Video to interact with the six teams, answer questions, and judge the winning team. The boot camp will be held in New Delhi on August 20th and 21st. Participants from leading Business Schools such as Delhi University, FMS, IIT-DMS, SJMSOM and IIM-R are registered to participate.

“I am truly excited about this boot camp and am looking forward to the creative ideas that will emerge from the entries,” said William Sanchez last week when I met him at MIT.

If your firm has an interesting product or service for the emerging markets and would like to be considered for the next Boot Camp, please send me an email at

10 reasons why Consulting Remains the Top Choice for MBAs

August 1, 2011

Although finance remains the number one recruiter, at around 30% of placements at top business schools, consulting is not far behind at a consistent 20% on average. If one takes into consideration that there are a limited number of vacancies in consulting as compared to finance, this gap could be even narrower.

So what could be the reasons why business school graduates like to enter the management consulting profession?

We offer 10 good reasons:

  1. Hooked on solving cases: Case-based teaching gets them hooked. On entering business school, b-schoolers find a different method of learning and working. Case-based learning provokes thinking and gets them researching facts, analyzing their findings, and synthesizing them into a big picture from which to derive creative solutions. They never thought they could come up with solutions like this before, and now they are hooked. Management consultants tackle their engagements just like a business school case, so it is quite understandable that those who enjoy their casework at business school want more of it.
  2. The creative and the non-conformists: Chances are this is what drew them to business school in the first place. They got tired of doing the same thing and taking direction from their boss on everyday tasks. They want to do things their own way rather than follow directions or bury themselves in a process. Few professions offer as much room for creativity as consulting. There is no micro-management by the boss other than the “dreaded deadline”: they are given the brief on their assignment, and then off they go.
  3. It is a gateway to interesting work: Management consultants get exposed to interesting work right away. Examples include Mergers & Acquisitions, Investment Banking, Venture Capital, Private Equity, Corporate Strategy, Transformation and Research. Understandably, this will only be fully true for those entering the top-tier management consulting firms, but candidates can still aspire to build their credibility through sideline projects at smaller or lesser-known consulting firms.
  4. Provides exposure to sunrise sectors: By definition, management consulting projects gravitate towards sectors experiencing a paradigm shift, currently Energy, Health and Information Technology. With the interest in environmental issues, the energy crisis and alternative energy, consultants are in high demand to provide research, direction and transformational advice. The health sector is under cost and regulatory pressure, which invites thought leadership from the consulting industry on making necessary business process changes, using technology effectively and applying emerging science such as biotechnology.
  5. Consultants play in many sandboxes: Most likely an MBA graduate would not want to get pigeonholed into one company’s vision, technology and ways of doing business. They would want to play in many different sandboxes, right? Well, that is what consultants get to do. Each project is different because each situation in the consulting world is different. They encounter different cultures, technologies, markets, infrastructures and even budget sizes each time. This makes the work challenging, interesting and a valuable learning experience.
  6. Puts an MBA in the “board room”: If not literally, then in effect: as management consultants, candidates are asked to find solutions to the toughest issues a company faces today. This means their work has visibility all the way to the CXO level, something few other professions offer, at least not at the start of a career.
  7. Preparation for going their own way: The consulting profession prepares candidates well for going into business for themselves or joining a startup, because management consulting quickly develops skills in tough decision-making, tight deadlines, making the numbers work, selling, purchasing, motivating, politics, influencing, handling conflict and communicating effectively. Learning all these skills is possible in other jobs, but it will take far longer.
  8. A stepping-stone before deciding on a long-term path: When starting out, young professionals may not be sure of their final calling. For example, do they want a leadership role and to rise to become a CEO? Do they like marketing or sales? Do they prefer to stay in engineering, technology or R&D? Do they like an operations role such as customer service, purchasing, accounting or HR? If they are not sure, they can enter management consulting, experience all fronts, and decide later where they want to go. Whichever line they pick, they will have ample experience and confidence to fast-track to the top once they make up their minds.
  9. Builds their personal network: Consulting is a profession that is practically guaranteed to push one’s LinkedIn connections past the 500+ mark faster than any other career. Moreover, it gives them a far richer and more diverse set of connections than most other professions. As we all know in these days of social media, career progression is very much tied to the relationships one builds.
  10. The need to get a faster return on investment: Last but not least, consulting happens to be a well-paid profession, so that expensive business school loan can be repaid sooner than in most other jobs. No one objects to an ROI that comes sooner rather than later.

On an ending note, it would be improper to portray management consulting as a flawless career choice. Consulting is an intense profession with long hours and frequent travel. More than 50% of those who enter will exit the profession within three to four years. On the brighter side, no ex-consulting professional will deny that it was an exhilarating experience and that it helped them tremendously in the formative days of their career.

ZENeSYS provides a live, case-based certification training program in Management Consulting for business school students and other aspirants. See our training website for more details.

The Great Liquidity Crunch – Private Equity Firms Re-invent Themselves

July 31, 2011

By Vaibhav Gupta – ZENeSYS Certified Consultant, IIM Indore (2010-12 Batch). Vaibhav completed his B.Tech. in Mechanical Engineering from Delhi College of Engineering. Before his MBA, Vaibhav worked in Project Management for 4 years with Siemens Energy. Vaibhav intends to join the Management Consulting industry after his MBA.

Abstract: The liquidity crunch, tighter regulations and more control demanded by limited partners are forcing the PE industry to re-invent itself.

The three key areas of transformation are:

  1.   More secondary deals – establishing the best valuations and finding the right buyers
  2.   Restructuring deals – creative ways of re-negotiating (past) unrealistic deals
  3.   Employing turnaround specialists – creating consultancy like capability for operational improvements

Over the last decade, contrary to what one might think, the private equity industry has gone through an economic cycle just like the stock market: up and down. The up phase of the PE market started around 2003 and continued to boom until 2007-2008, just when the financial crisis kicked in.

During the up phase, which started after the telecom bust in 2002, the PE industry saw the largest leveraged buyouts (LBOs) in history. In 2006, PE firms bought more than 650 US companies for a total of $375 Billion, around 18 times the value of the transactions closed in 2003 [ref 1].

The LBO movement was fueled by decreasing interest rates, low lending standards, and the tighter regulatory burden of running a public enterprise (the Sarbanes-Oxley Act) [ref 2]. This, in turn, is one of the reasons why venture capitalists started relying on sales to strategic buyers for an exit.

The arrival of the financial crisis in 2008 and the accompanying liquidity crunch brought the LBO frenzy to a quick halt. The leveraged finance market almost stood still during and after the crisis, and many deals were withdrawn or had to be re-negotiated [ref 3].

The PE industry is measured with the help of two metrics: fund raising and investment activity. Fund raising refers to the money investors have committed to PE funds in a year. That activity fell to $150 Billion globally in 2009 from $450 Billion in 2008, the lowest figure since 2004. The lack of debt in the following year, 2010, meant no hope of any speedy recovery.

The other metric, investment activity, which represents the financing of businesses, fell from $181 Billion globally in 2008 to just over $90 Billion in 2009. In 2010, it picked up to $110 Billion [ref 5]. This minor jump could possibly be attributed to increased investments in Small & Medium Enterprises and in emerging markets such as Brazil, China and India.

Since PE funds acquire firms in order to sell them at a profit later, life becomes difficult for them during an economic slowdown, all the more so when there is a liquidity crunch on top of it. This was apparent from the total value of PE exit transactions, which fell from $151 Billion in 2008 to $81 Billion in 2009.

Over the next five years, through 2016, more than $800 Billion in loans extended on committed deals will become due or have to be refinanced [ref 6]. This $800 Billion is distributed almost equally between bank loans and high-yield bonds. To complicate the hiving off of these assets further, the US government has passed a bill requiring any PE firm with more than $150 Million in assets under management to register with the SEC.

The implications are public disclosure of risks, business activities, names of the personnel involved, assets owned, amounts owed to each creditor, performance metrics, debt, and defaults [ref 7].

According to estimates [ref 9], there is one trillion dollars’ worth of dry powder in PE funds globally. However, LPs (limited partners) are now demanding more control and requesting more information about their investments. The LPs want to keep track of drawn-down capital so that GPs (general partners) don’t overdraw their limits [ref 8]. This has, in effect, created a liquidity crunch of sorts within the PE community itself.

This puts PE funds in a tight spot: how should they service their existing debts and acquire new assets? For servicing debts, the options are secondary markets, restructuring deals, or employing turnaround specialists to improve valuations. For acquiring new assets, they must look harder in the marketplace or find greener pastures in emerging markets.

Hence, turnaround specialists are in demand nowadays. PE firms are turning away from traditional leveraged deals and looking into investing in distressed companies. They feel it is better to restructure deals based on a change in strategy than to take the money out and pursue the matter in court.

Turnaround is becoming more and more important as top lines (revenues) shrink and even vendors stop extending favorable credit terms. The success of any PE acquisition therefore comes down to operational excellence: improved management, optimized expenditures, and rooting out inefficiencies such as the overcapacity created in high-growth years.

Interestingly, PE firms have identified a new gap in the market: companies that do not have the means to hire expensive management consultants now find this a welcome opportunity to bring high-quality leadership aboard.

Emerging markets such as Brazil, China and India remain attractive, and PE firms need to find ways of entering them. China and India were the two fastest-growing economies even during the recession, and they need to develop infrastructure to support that high growth. Brazil is set to host the 2014 World Cup and the 2016 Olympics.

With increasing pressure from regulators, a lack of liquidity, and the tighter control demanded by limited partners, future PE deals will need to be financed and executed with better insights and strategic planning. This means deeper research before deals are struck, awareness of best practices in target markets for operational excellence, and market intelligence for finding the right buyers quickly and at the best price.

This article has been written as a part of a research project undertaken by one of our certified consultants. All information and content has been derived from secondary research and insights gained from recent projects at ZENeSYS for the PE industry. Credits have been provided in the references for text and data. ZENeSYS provides Market Research and Market Intelligence services to Private Equity Industry on a regular basis.


References:
  5. Wharton PE Report 2010
  7. A Closer look at “Dodd-Frank Wall Street Reform and Consumer Protection Act – Impact on Advisers to PE Funds” – a report by PwC, May 2011

Tips for Quality in Data Analytics Projects

July 18, 2011


If you are thinking of launching a data analytics project at the departmental level, following best practices in gathering, preparing and loading data into your analytics system will be helpful. This white paper was written by ZENeSYS for iQCodex, the maker of the leading data collection and transformation tool iQCodex. Visit for more information.

What is Data Analytics?
Companies are eager to harness the sea of information around them for deep insights. For example, with the recent standardization of medical health records, health care providers now have access to historical data on chronic illness from a variety of sources, which can be helpful for better diagnosis. The Internet, business databases, news feeds, partners, clients and internal company data available online mean access to market conditions in near real time. Because of this, companies feel pressure to process information for operational efficiency and competitive advantage using Data Analytics: the science of analyzing data to develop insights.

Data Quality is Key
A significant part of any data analytics task is data collection, scrubbing and validation, better known as ETL (extraction, transformation and load). According to some studies, between 40% and 70% of the cost of a data analytics project ends up being spent on extraction, transformation and loading.

Business Units Take on Data Analytics Projects
With the arrival of affordable and easy-to-use ETL tools, business units rather than IT departments are taking on data analytics initiatives. This is understandable, because both the input and the output of a data analytics initiative belong to the business units: the input is raw data, which only the business units know where to find, and the output is insights, which again only the business units are capable of defining. If your business unit is considering a data analytics project, it is important to understand the best practices involved in capturing, transforming, validating and updating data for analysis.

Best Practices for Ensuring Data Quality
Extraction, Transformation and Loading (ETL), in simple terms, means the job of (1) collecting or capturing data, (2) ensuring its accuracy, (3) converting it for relevance and homogeneity, and (4) updating it to keep it fresh.

We will explain each of these in more detail now.
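Before diving into each step, the four-step flow can be sketched in code. This is a minimal illustration of the pipeline shape, not any particular ETL product; every function name and record field below is invented:

```python
# Illustrative sketch of the four ETL steps: collect, validate,
# convert, and update. All names are hypothetical, not from a real tool.

def collect(sources):
    """Gather raw records from heterogeneous sources."""
    records = []
    for source in sources:
        records.extend(source)  # each source yields dict records
    return records

def validate(records):
    """Keep only records that pass basic accuracy checks."""
    return [r for r in records if r.get("id") is not None]

def convert(records):
    """Normalize each record into a homogeneous target format."""
    return [{**r, "name": str(r.get("name", "")).strip().lower()} for r in records]

def update(store, records):
    """Refresh the analytics store, overwriting stale entries by id."""
    for r in records:
        store[r["id"]] = r
    return store

# Wire the steps together on toy data
sources = [[{"id": 1, "name": " Alice "}],
           [{"id": None, "name": "junk"}, {"id": 2, "name": "BOB"}]]
store = update({}, convert(validate(collect(sources))))
```

In a real tool each stage would be configurable rather than hard-coded, but the overall shape of the pipeline is the same.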

Collection of Data: Data will inevitably come from a variety of sources, and we can rest assured that the format will vary from one source to another. If it comes from a structured source such as a database, then it will arrive via a database language such as SQL. This is good news, as structured data is the easiest to deal with.

The bad news, however, is that structured data from external sources such as medical records or financial transactions has a different twist to it. It is likely to be in XML file formats built on an industry-specific data exchange standard, and these XML standards tend to vary by industry, e.g. insurance, finance or health.

Non-structured data can come from text, PDF, Word or Excel files, though in most cases it will be Excel or a comma-separated (CSV) file. This type of data generally comes from customers, news, industry reports and internal project documents, and tapping into it is often underestimated. By some estimates, between 40% and 70% of the data in analytics projects comes from MS Excel, MS Access or some other desktop or legacy application.
The important thing to note here is that you have to make sure the ETL tool you use can handle all your required input formats with equal ease.

Accuracy of Data: The foremost step in ensuring accuracy is avoiding manual data entry. This means an ETL tool should be able to “read” all data sources electronically, which requires an interface that handles all the formats described in the previous section, i.e. SQL, XML, Excel, Access, and plain-text CSV files.

Electronic reading not only assures accuracy; it also reduces cost, speeds up data collection and, most importantly, safeguards against errors and fraud, which is becoming increasingly important as regulators tighten the laws.
A good ETL tool will also provide a means of cross-checking data for errors. For example, there should be validation checks for date formats, phone numbers and postcodes. Some ETL tools provide look-up tables and programming capability for cross-checking data accuracy.

For example, an insurance claim record may contain a claim date for an incident that occurred prior to the coverage date. Here a date format check alone will not be enough; we need a way to “validate” the incident date by fetching the policy validity dates before the claim record is allowed through.
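A cross-field check of this kind can be sketched as follows; the record layout, field names and policy table are invented for illustration, and a real system would fetch coverage dates from the policy database:

```python
from datetime import date

# Hypothetical policy lookup: policy id -> (coverage start, coverage end).
policies = {"P-100": (date(2010, 1, 1), date(2010, 12, 31))}

def claim_is_valid(claim):
    """Reject claims whose incident date falls outside the policy's coverage window."""
    start, end = policies[claim["policy_id"]]
    return start <= claim["incident_date"] <= end

good = {"policy_id": "P-100", "incident_date": date(2010, 6, 15)}
bad = {"policy_id": "P-100", "incident_date": date(2009, 11, 2)}  # before coverage began
```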

Data Conversion: Conversion of data is probably the most complex feature of an ETL tool, so we will go through several aspects of data conversion requirements. Sometimes data is simply not in the format we need; a simple example would be the need to convert currency. ETL tools should be able to convert to the desired alternative units on the fly.
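An on-the-fly currency conversion might look like the sketch below. The rate table is a made-up placeholder; a real pipeline would pull live or dated rates from a reference source:

```python
# Convert monetary amounts to a single target currency as records are captured.
# Rates here are illustrative placeholders, not real market rates.
RATES_TO_USD = {"USD": 1.0, "EUR": 1.35, "INR": 0.022}

def to_usd(amount, currency):
    """Express an amount in USD using the table's rate for that currency."""
    return round(amount * RATES_TO_USD[currency], 2)
```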

The second type of conversion requirement is semantic normalization. Consider a role that is referred to in multiple ways, e.g. a student, an apprentice, a trainee or an observer. If they all mean the same thing for your analysis, they have to be normalized into a single preferred label before entering your analytics engine.
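Such normalization amounts to a synonym-to-canonical mapping applied at load time; the vocabulary below is invented for illustration:

```python
# Map synonymous labels to one canonical term before loading.
CANONICAL = {"student": "trainee", "apprentice": "trainee",
             "trainee": "trainee", "observer": "trainee"}

def normalize_role(label):
    """Return the preferred label, or the input unchanged if unknown."""
    return CANONICAL.get(label.strip().lower(), label)
```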

A third type of conversion is a bit more complex than the currency example. What if we need only “net of tax” sales figures when only gross sales figures are available at the source? The ETL tool should be able to compute net sales from gross sales by deducting the prevailing tax rate for that region as the source data is captured. This is the capability of allowing a “computed value” as input.
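A computed-value input of this kind reduces to a small derivation applied at capture time; the regional tax rates below are made-up placeholders:

```python
# Derive a "net of tax" figure from gross sales at capture time.
TAX_RATES = {"north": 0.20, "south": 0.15}  # illustrative rates

def net_sales(gross, region):
    """Compute net sales by deducting the region's prevailing tax rate."""
    return round(gross * (1 - TAX_RATES[region]), 2)
```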

A fourth type of data conversion is “protocol conversion”. With the advent of standardization, interchange of data between organizations is now facilitated by communication standards (also called protocols) such as SWIFT in financial services, HL7 for health records, and ACORD for the insurance industry. A good ETL tool must provide the necessary “hooks” to translate these standards and render the data into a format ready for analysis in your analytics engine.

The fifth type of conversion challenge is handling ad-hoc formats. Even today, many organizations have “system islands”: different systems running different applications with no connection between them. A data analytics project may need to tap into data from one or many of these disjointed systems, such as human resources, accounts payable, inventory management or customer relationship management. Some of these systems may be old, with proprietary data formats. In such cases the ETL tool must at least be able to read comma-separated files, something even older systems are usually capable of generating. Conversion from such files, however, will rarely go entirely according to plan.

When an error is encountered, the ETL tool needs to “flag” an exception. An administrator then examines the failed conversion step and makes the conversion manually. Sophisticated ETL tools will “learn” these exception conditions so that the same kind of encounter does not trigger a manual intervention request again.

Duplicate data is a very common exception condition. The same piece of data may appear more than once, quite likely with conflicting parts. For example, an accounts payable system may hold a customer’s bank account number as XYZ while a purchase order from the same customer shows it as ABC. In such cases the ETL tool must de-duplicate the records or, in case of a conflict, establish the correct bank account number.
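A de-duplication pass can merge repeats and escalate conflicts rather than silently picking one value; the record fields below are invented for illustration:

```python
# Merge duplicate customer records, flagging conflicting fields for
# manual review instead of guessing which value is correct.
def deduplicate(records):
    merged, conflicts = {}, []
    for rec in records:
        key = rec["customer_id"]
        if key not in merged:
            merged[key] = dict(rec)
        elif merged[key].get("bank_account") != rec.get("bank_account"):
            conflicts.append(key)  # e.g. XYZ vs ABC: escalate, don't guess
    return merged, conflicts

records = [
    {"customer_id": "C1", "bank_account": "XYZ"},
    {"customer_id": "C1", "bank_account": "ABC"},  # conflicting duplicate
    {"customer_id": "C2", "bank_account": "QQQ"},
]
merged, conflicts = deduplicate(records)
```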

Stripping out non-relevant data is another commonly encountered exception condition. This can and will happen when data comes from legacy systems or an external organization. Take, for example, a data analytics exercise on eye-related problems in diabetic patients. The patient history sources will likely contain many data points apart from eye history, so the ETL tool has to systematically filter in only the eye-related data from diabetic patient records in the incoming source.
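Filtering of this kind is a predicate applied to each incoming record; the diagnosis keywords and record fields below are invented for illustration:

```python
# Filter in only relevant records from a wider source, e.g. keeping
# eye-related entries for diabetic patients.
EYE_TERMS = {"retinopathy", "glaucoma", "cataract"}  # illustrative keyword set

def relevant(record):
    """True only for diabetic patients with an eye-related diagnosis."""
    return record["diabetic"] and record["diagnosis"] in EYE_TERMS

history = [
    {"patient": 1, "diabetic": True, "diagnosis": "retinopathy"},
    {"patient": 1, "diabetic": True, "diagnosis": "fracture"},
    {"patient": 2, "diabetic": False, "diagnosis": "glaucoma"},
]
filtered = [r for r in history if relevant(r)]
```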

Updating of Data: Data analytics is not a one-time effort. It is an ongoing task that requires refreshing data on a continuous basis. However, to update data, an ETL tool needs to do more than just fetch new data.

Synchronizing the sequence of data updates: An address change, for instance, needs to be applied before new eligibility rates by region are applied to a particular customer record. Each update action should have a way of taking into account its dependency on another update, or the planned time of that update.

Timing of data updates: Even though certain data may be available immediately, the actual update may need to occur at a specific time to preserve data integrity. This often happens with legacy or older systems where processing takes place at night and the required data can only be extracted after 5 am. Here the ETL tool needs to ensure that updates depending on that nightly batch start only after 5 am.

Partial or selective updates: Ad-hoc sources of data such as external systems and personal data files are often not capable of providing “selective updates”, meaning they deliver the full set of data records each time. So if you are importing your customers’ credit history from an external source, each update means dealing with the entire data set, even though you are only interested in the records that have changed since your last load. Sometimes there will be a flag indicating which items have changed since the last batch, but most likely there will not. If change flags exist, it is up to the ETL system to detect them and update only the changed records; if not, the ETL tool first needs to determine which records have changed. Doing a full load for every update is a bad idea: it can slow down your Data Analytics server significantly, and you may end up with data you no longer need.
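One common way to detect changed records when the source provides no change flags is to hash each record and compare against the hashes from the previous run. This is a sketch with invented record fields, not a prescription for any particular tool:

```python
import hashlib, json

# Hash each record deterministically so content changes are detectable.
def record_hash(rec):
    return hashlib.sha256(json.dumps(rec, sort_keys=True).encode()).hexdigest()

def changed_records(batch, previous_hashes):
    """Return only records whose content differs from the last load."""
    return [r for r in batch if record_hash(r) != previous_hashes.get(r["id"])]

old = {"id": 1, "score": 700}
new = {"id": 1, "score": 720}        # this customer's record changed
unchanged = {"id": 2, "score": 650}  # this one did not
previous_hashes = {1: record_hash(old), 2: record_hash(unchanged)}
delta = changed_records([new, unchanged], previous_hashes)
```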

Changes in transformation logic: The entire ETL process is, in effect, a series of algorithms and business rules for capturing, transforming and loading data. Over time, these algorithms will need to be modified because of changes at the source, discovery of a better way of doing things, or a change in policy. An ETL tool should allow an administrator to do this easily: to modify a formula or rule, the administrator should first be able to see how it is currently set up and then edit it. This is a very important consideration when selecting an ETL tool; those that do not provide a transparent setup will become unmanageable at some point, as people forget how the logic was set up long ago.

Business process changes: Let’s say today your Data Analytics system is set up to import credit ratings directly from a rating agency, and tomorrow your accounts department wants to take on the role of approving the ratings because you have access to data from multiple rating agencies. Some ETL systems are workflow-enabled to accommodate building in such an approval process. iQRoute from iQCodex excels in this area, with workflow features such as escalating, authorizing, approving and overriding.

Keeping an audit trail: With the advent of laws such as Sarbanes-Oxley in the US and Solvency II in the EU, proper audit trails are becoming vital. Regulatory pressure aside, it is good practice to have an audit trail so that if things go wrong, the system can pinpoint what was changed, who made the change, and when it was done.

Ability to track back on changes: Often things go wrong and the mistakes are discovered only days or weeks later. In such instances the audit trail will indicate the source of the problem, but to get back on our feet a rollback to the former condition becomes a huge lifesaver. A good ETL system will provide date- and time-stamped snapshots of changes made; an administrator can then review each historical change and decide how far back to go to start over.

Many organizations are launching Data Analytics initiatives at the departmental level to gain invaluable market and operational insights. However, data analytics outcomes depend on the quality of the data. Several affordable tools on the market provide ETL (extract, transform and load) capability. Before you launch a data analytics project, consult this checklist to ensure you select an ETL tool with all of these features; they are essential for data quality and therefore for the validity of the insights derived.

Learning and competing

As a fun way of learning consulting skills, we launched a “Prove your analytical skills” competition at the end of March this year.

There was a series of six contests, each tied to one of our six management consulting training modules.

The six challenges:

  1. What is the difference between IT Consulting & Management Consulting? – Contestants were tested on their ability to observe and list the differences in a structured format.
  2. Consulting Knowledge Base – use or abuse? – For this we asked contestants to analyze the underlying ethical issues in the following FT article: McKinsey model springs a leak
  3. Issue analysis using an Initial Hypothesis – We told contestants that they were management consultants for the divestiture of Hero Group from Honda and needed to come up with a strategy using this tool.
  4. Creating Hybrid Frameworks – We asked our contestants to create a framework for how the Apollo Group of Hospitals might enter Africa.
  5. Creating Recommendations using a Synthesis Matrix – We gave them an analytical snapshot of research data for launching a cosmetic product done in one of our earlier projects. Contestants were asked to develop a set of recommendations on the basis of facts.
  6. Write a SHORT Consulting Proposal – Contestants were asked to write a mock proposal for the University of California’s purchasing system. We tested their ability to use planning tools to write a proposal that was convincing to the client and addressed threats from the competition.

We wish to thank all who participated, and congratulations once again to Luis Villegas for first place and to Deepak Verma as runner-up!

If you are a B-School student or a professional looking to learn consulting skills, join our network @


Ingredients of a White Paper – What goes around comes around

Thanks to white papers, a vast amount of knowledge is readily available on the Net today. White papers are instrumental in getting up to speed on unfamiliar concepts when there is suddenly a need to become an “expert” overnight!

However, what goes around eventually comes around: the more white papers the industry contributes, the more researchers, potential clients and businesses will benefit from this ecosystem.

But what is the hallmark of a good white paper?

  1. Subject of Current Interest: It needs to be a topic that will catch the attention of the potential client community, which may be grappling with sudden changes in the industry such as a new technology, a regulation, or a point of view
  2. Short and Focused: Should not be more than 4 to 5 pages long. This can be easily done by focusing on a very specific area of the issue. Generic or all-encompassing papers can become very lengthy, expensive to create and unlikely to be read by the target community
  3. Proving the Viewpoint: There should be empirical data (facts and figures) to support the insights in the paper. Even though most readers will not follow up on references, these should be provided to show that the insights are not merely a subjective viewpoint
  4. The 80:20 Rule for Impartial Insight versus Plug-in: At least 80% of the paper should provide impartial insight on the topic. In all honesty, it is fair to provide 80% “education value” in exchange for at most 20% “promotional content”. This of course assumes that the promotional content legitimately fits into the solution under discussion
  5. Interesting Reading: It should be engaging to read, with some charts and/or pictures to break the monotony of words. Pictures are worth a thousand words; they also convey the sense that someone took the time to make the paper easy to understand and to save the reader time
  6. References must be Quoted: Quoting references not only attributes credit to the original contributors but provides a means of assessing the degree of content legitimacy for the reader