The 2015 edition of the World Bank’s widely distributed Doing Business report received a lot of media attention. However, it did not tackle the serious criticisms made by the World Bank’s Independent Evaluation Group (IEG), civil society organisations (CSOs) and the Independent Panel.
Created by the World Bank in 2002, Doing Business ranks the business climate of 189 countries on the basis of 10 indicators.
Three main changes to Doing Business 2015 fail to address fundamental flaws
1. Rankings: Distance to frontier to give a clearer picture of reality
The most notable change in the 2015 report is the way rankings are calculated: economies are now ranked by their average ‘distance to frontier’ score – a measure of how far each economy is from the best performance observed on each indicator – rather than by their average percentile ranking. Rankings were first introduced in 2005 to make the report more influential among policy-makers. The Bank’s Chief Economist, Kaushik Basu, argues in the foreword of the report that rankings make “possible meaningful international comparisons of the regulatory performance of economies, contributing, along the way, to increasing the accountability of political actors”.
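To illustrate how this kind of aggregation works, the following is a minimal sketch based on the Bank’s published description of the distance-to-frontier method, using made-up indicator values (the country labels and numbers are purely hypothetical): each raw value is rescaled between the worst observed performance (score 0) and the frontier, i.e. the best observed performance (score 100), and the ranking simply orders economies by their average score.

```python
def distance_to_frontier(y, worst, frontier):
    """Rescale a raw indicator value to 0-100, where 100 is the frontier
    (best observed performance) and 0 is the worst observed performance."""
    return 100 * (worst - y) / (worst - frontier)

# Hypothetical values for a single indicator, e.g. days to start a business.
# Lower is better here, so the frontier is the minimum observed value.
days = {"A": 5, "B": 30, "C": 90}
worst, frontier = max(days.values()), min(days.values())

scores = {c: distance_to_frontier(v, worst, frontier) for c, v in days.items()}

# The ranking is just the economies ordered by score, highest first.
ranking = sorted(scores, key=scores.get, reverse=True)
print(ranking)  # ['A', 'B', 'C']
```

This sketch also makes the Panel’s weighting criticism concrete: in the full report each economy’s overall score is an average across all ten topics, so any change in how a topic is scored or weighted can reshuffle the final ordering without any change in the underlying business environment.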
However, this reform does not answer the criticism expressed by the Independent Panel or civil society organisations, which had called for the removal of the aggregate rankings.
The Independent Panel outlined several problems with the rankings system in its report.
First, it explained that they were an “arbitrary method of summarising vast amounts of complex information as a single number. Changing the weight accorded to a particular indicator can easily change an item’s ranking”. Currently, the Bank uses a subjective way of weighting the 10 different indicators’ scores to produce the aggregate ranking. The ranking itself is merely the countries put in order, according to the Bank’s subjective scores. It is these final results that attract media attention, with little consideration for the information’s subjectivity.
Second, in line with the IEG’s general criticism of the Bank’s investment climate work, Doing Business indicators do not consider the social or economic benefits of regulation, making the rankings a dubious measure in terms of the Bank’s goals of eradicating poverty and promoting shared prosperity. The Bank defines Doing Business as an attempt to measure how governments provide a positive business environment. According to the Bank, governments should fight against “excessively burdensome regulations [which] can lead to large informal and less-productive sectors, less entrepreneurship and lower rates of employment growth”. However, as many experts have pointed out, not all the indicators are self-evidently linked to higher rates of economic growth. For example, the US Economic Policy Institute showed in a study that low corporate tax rates do not necessarily increase the rate of economic growth. Taking this into consideration, the level of corporate taxation in the ‘paying taxes’ indicator is misleading.
Furthermore, there is no empirical evidence that the rankings are linked to poverty impact or even to economic growth. For example, many of the world’s fastest growing economies rank very low – China is at number 90, India at 142 – while poor economic performance can happily coincide with excellent Doing Business scores: Macedonia is at number 30 and South Africa at number 43. In that sense, Doing Business rankings push for de-regulatory reforms without clear evidence that these are important for achieving the Bank’s goals. This lack of evidence is the core of the attack on the report by emerging markets, which prompted its review; their critiques are summarised in the Independent Panel report.
Third, rankings tend to push countries to manipulate the indicators to improve their rankings instead of trying to improve the reality that the indicators try to capture. In 2012, Russia’s President Vladimir Putin ordered the government to improve Russia’s Doing Business ranking from 120th in 2011 to 50th by 2015 and to 20th by 2018. Although the 2015 target was missed, Russia witnessed a notable improvement and is currently ranked 62nd. Analysis by the Russian daily newspaper Kommersant, translated by Worldcrunch, however, reveals that this improvement is based on cosmetic reforms and not on a real improvement in the business environment in Russia. The ease with which results can be manipulated, and the way that this tends to shift the focus of governments away from reforms with real impact on development, is a major concern.
Eurodad’s main concern is that the rankings push countries to devote significant efforts to improving their rankings, even though these reforms may not bring any benefits to their poorest populations, and may divert attention and resources from other, more important reforms. In so far as the rankings promote controversial policies, they can even be harmful to the poor. Given that the rankings are highly subjective and not statistically sound, Eurodad is among the many voices calling for them to be abandoned. Furthermore, as a benchmarking exercise, Doing Business promotes the idea that business environments should be inspired by ‘best’ practices. This conception of a policy reform agenda ignores the fact that every country operates in a specific and complex context, which should be the starting point in determining the kind of reforms that are needed.
2. Data sample: the inclusion of a second city
Previously, Doing Business data was collected on the basis of a standardised scenario of the business regulations faced by a fictitious firm that has 60 employees, is based in the country’s largest business city, and has exports worth more than 10% of its sales (among other characteristics). This year, the second largest city was added to the data sample for the 11 countries with more than 100 million inhabitants.
The Bank claims that this reform is an answer to the Independent Panel report and its methodological recommendations. The Panel criticised the fact that using only one city “has the potential to create a distorted picture in larger countries and those with a federal system”. However, the data sample still ignores rural areas where many small- and medium-sized enterprises (SMEs) are located in the developing world.
The Bank has ignored other important critiques regarding its methodological approach to data collection, in particular, the use of law firms as the main source of data. This approach implies that data may be disconnected from reality on the ground. Concrete application of laws and regulations, as well as corruption, are absent from the report. This can push governments to focus on the reform of their regulatory framework with little regard for its concrete implementation. The Doing Business team tried to solve this by expanding the data collected for several of its indicators (see below) but, as pointed out by the Independent Panel, “there are inherent limits to what their methodology can achieve”.
In addition, the Independent Panel emphasises the ‘one size fits all’ approach of the report’s methodology, focusing on “whether one specific rule does or does not exist in different countries. This effectively disregards other legal solutions that achieve the same goal.”
3. Changes within the indicators
Doing Business expanded the data collected for three out of its 11 indicators (the 2016 edition will include an expansion of five other indicators). The objective of this expansion is to improve the assessment of the quality of the regulations. This year, the report introduces more features on the strength of legal rights and depth of credit information and on minority shareholders’ rights. It also introduces a measure of the strength of the legal framework for insolvency.
In general, this reform fails to solve most of the problems with the indicators. As pointed out by the Independent Panel, there is no “scientific evidence to support the report’s current selection of indicators”. The report fails to explain how the selection of indicators is relevant to the Bank’s goals of eradicating poverty and promoting shared prosperity. It seems that the main rationale behind the selection of criteria is the idea that regulation is always ‘red tape’ that prevents business development, job creation, economic growth and ultimately poverty eradication.
In addition, some indicators need additional information in order to give a relevant assessment of the area they cover. From that perspective, the extension of the qualitative aspects of the indicators is a step in the right direction. For example, the inclusion of the reliability of supply in the ‘getting electricity’ indicator will give a clearer picture of reality. The same kind of reform should have been designed for the ‘access to credit’ indicator, which does not include any measure of the availability of credit for firms.
This misleading indicator drove the Zambian government to undertake a reform programme that improved its ‘access to credit’ ranking. According to a paper produced by the Jesuit Centre for Theological Reflection (JCTR) and the Catholic aid agency CAFOD, this programme had little impact on local micro and small enterprises, which are largely excluded from the reforms or lack information about how to benefit from them. Such reforms are an example of an intervention designed to improve a country’s ranking and attract foreign direct investment rather than to prioritise the real needs of local and smaller businesses.
Two other indicators are particularly problematic. Although data from the labour market indicator does not affect a country’s overall ranking, this information is still collected and published on the Doing Business website. This indicator continues to treat labour market regulations in terms of their costs rather than their benefits. It has been widely criticised by many actors, including the International Trade Union Confederation (ITUC).
Finally, the ‘paying taxes’ indicator continues to reward lower corporate tax rates, although not automatically. This ‘race to the bottom’ has negative effects on development, as outlined by a recent International Monetary Fund (IMF) paper.
This off-target reform fails to make the report relevant for development
Doing Business 2015 comes at the end of an extensive review process. However, it does little to address the major concerns that were raised during that process. The result is a highly publicised report that tells us very little, has limited relevance to poverty alleviation, and may end up promoting the wrong reforms. If the report continues in its current form, CSOs, including Eurodad, will continue to oppose this misguided exercise.
There are 11 indicators, but ‘labour market’ is not used in the rankings calculation.
The aggregate ranking for all 189 economies is calculated as the simple average of individual percentile rankings on each of the 10 topics included in the project.
2.1% average economic growth between 2010 and 2013.
2.7% average economic growth between 2010 and 2013.
Bangladesh, Brazil, China, India, Indonesia, Japan, Mexico, Nigeria, Pakistan, the Russian Federation and the United States.
Currently ranked 23rd.