Gross exports accounting is a novel sub-area of research that seeks to allocate the value added in gross trade flows to its true country and sector of origin and of destination. Various frameworks have recently been proposed to perform such decompositions. This paper presents a further effort to generalise the accounting framework so that it may be easily interpreted, customised and implemented in matrix computation software. The principal contribution is therefore a relatively simple way to derive the formulae for the decomposition of cumulative value added flows embodied in international trade. The underlying accounting approach is largely similar to that of [Koopman et al., 2012; Stehrer, 2013], but the block matrix formulation allows the user to decompose all bilateral flows simultaneously at the country and/or sectoral level. The refined framework is applied to describe Russia's export performance from the global value chain perspective using data from the World Input-Output Database (WIOD) for 2000 and 2010. According to the findings, the countries that directly receive most of Russia's exports are not exactly those that use most of Russia's value added. Russia's mining sector is found to be an intrinsic part of a complex downstream value chain, where it indirectly contributes value to partner exports.
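The block-matrix accounting sketched above rests on the global Leontief inverse. A minimal numerical illustration (a toy 2-country × 2-sector coefficient matrix of our own invention, not WIOD data) shows how the value-added origins embodied in each sector's exports can be recovered:

```python
import numpy as np

# Toy inter-country input-output coefficients (hypothetical numbers).
# Rows/columns: (country 1, sector 1), (1, 2), (2, 1), (2, 2).
# A[i, j] = inputs from sector i per unit of output of sector j.
A = np.array([
    [0.10, 0.05, 0.08, 0.02],
    [0.04, 0.15, 0.03, 0.06],
    [0.07, 0.02, 0.12, 0.05],
    [0.01, 0.06, 0.04, 0.10],
])

n = A.shape[0]
B = np.linalg.inv(np.eye(n) - A)      # global Leontief inverse
v = 1.0 - A.sum(axis=0)               # value-added coefficients (VA absorbs all
                                      # non-intermediate inputs in this toy setup)

e = np.array([10.0, 5.0, 8.0, 12.0])  # hypothetical gross export vector

# VAX[i, j] = value added originating in sector i embodied in exports of sector j.
VAX = np.diag(v) @ B @ np.diag(e)

# Column sums recover gross exports exactly, because v'B = 1' by construction.
assert np.allclose(VAX.sum(axis=0), e)
```

Decomposing the full matrix `VAX` at once, rather than one bilateral flow at a time, is what the block formulation buys: slicing its off-diagonal blocks gives all bilateral value-added trade flows simultaneously.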
Although it emerged only recently, auction theory has become a well-established branch of theoretical economics with important practical applications. As the theory progresses, its basic assumptions become the subject of further investigation, and new directions emerge. Microeconomics in general, and auction theory in particular, too often assumes away aftermarket interactions, which are a common feature of real markets and have a powerful impact on strategies and incentives. Lately, however, a body of literature has emerged that incorporates the possibility of resale into the game-theoretic analysis of auctions. This paper reviews that literature. It highlights the role of bargaining power in the aftermarket as one of the central issues. It then reports how standard auction formats – first- and second-price auctions in particular – compare in terms of the seller's revenue they generate. The paper then presents generalizations of Myerson's approach to constructing optimal auctions when resale is possible; the models discussed require delicate assumptions. Next, the survey covers more specific issues: different approaches to modelling the aftermarket, decisions to enter the primary auction, the effects of disclosing information from the primary auction, and the role of speculators. Finally, the paper overviews empirical research on auctions with resale, the newest and most rapidly developing branch of this literature.
There is broad consensus in the literature on the negative impact of corruption on economic growth and development. Due to its illegal and covert nature, corruption is challenging for economists to analyse, and the lack of data is the main barrier for researchers. The experimental approach makes it possible to generate the required data and could become the most promising way to study the determinants of corruption and to test possible anti-corruption measures. In this paper we present a wide-ranging survey of the experimental evidence on corruption. The first part gives a brief introduction to the experimental economics approach. The second part reviews in detail the game-theoretic corruption models. We find that the lion's share of these models has a standard design based on a simple two-agent interaction through bribery. In the third part we discuss the main findings of the literature that uses experimental techniques to study corruption. In particular, we analyse cultural and gender differences in the propensity to engage in corrupt behaviour, income and monitoring effects, and factors that influence one's tolerance of corruption. Finally, in the last part we discuss the advantages and disadvantages of the experimental approach for corruption studies.
We investigate the consequences of excessive international debt overhang as they relate to both debtor and creditor countries. In particular, we assess the impact of monetary policy on financial stability and how it can be used to smooth borrowers' as well as creditors' consumption over the business cycle. Based on [Goodhart, Peiris, Tsomocos, 2018], we establish that an independent countercyclical monetary policy – one that contracts liquidity whenever debt grows and expands it when default rises – reduces the volatility of consumption. In effect, monetary policy provides an extra degree of freedom to the policymaker. We apply our approach to the Czech and Eurozone economies during the 1990s. In our model, we introduce endogenous default à la [Shubik, Wilson, 1977], whereby debtors incur a welfare cost in renegotiating their contractual debt obligations that is commensurate with the level of default. However, this cost depends explicitly on the business cycle and should be countercyclical. Hence, contractionary monetary policy reduces the volume of trade and efficiency, thus increasing default: as the default cost rises, the associated default-accelerator channel engenders higher default rates. On the other hand, lower interest rates increase trade efficiency and, consequently, reduce the amplitude of the business cycle and benefit financial stability. In sum, the appropriate design of monetary policy complements financial stability policy. The modelling of endogenous default allows us to study the interaction of monetary and macroprudential policy.
We study the world's largest credit risk losses since 1972, the year the predecessor of the Basel Committee on Banking Supervision was established. Choosing a round threshold of a loss amount equivalent to current USD 100m and entity total assets in excess of current USD 500m as of the loss announcement date, we collected a dataset of 56 cases with a total loss of current USD 700bn (ca. USD 900bn in constant 2018 dollars) over the last half-century. The two most unexpected findings are the following. First, we verified the announced loss amounts by analysing the dynamics of stock quotes around the loss announcement dates; we were thus able to trace three cases where the losses announced by the mass media appear to have been exaggerated. Second, there is a series of events featuring a combined disclosure of a credit risk loss and an operational one; it seems the latter might have been used to partially cover the former.
This paper studies learning in a strategic environment using experimental data from the Rock-Paper-Scissors game. In a repeated game framework, we explore the response of human subjects to the uncertain behavior of a strategically sophisticated opponent. We model this opponent as a robot that plays a stationary strategy with superimposed noise varying across four experimental treatments. Using experimental data from 85 subjects playing against such a stationary robot for 100 periods, we show that humans can decode its strategy, on average outperforming the random response to such a robot by 17%. Further, we show that the human ability to recognize such strategies decreases with exogenous noise in the behavior of the robot. We then fit the learning data to classical Reinforcement Learning (RL) and Fictitious Play (FP) models and show that the classic action-based approach to learning is inferior to the strategy-based one. Unlike previous papers in this field, e.g. Ioannou, Romero (2014), we extend and adapt the strategy-based learning techniques to the 3x3 game. We also show, using a combination of experimental and ex-post survey data, that human participants are better at learning separate components of an opponent's strategy than at recognizing the strategy as a whole. This decomposition offers them a shorter and more intuitive way to figure out their own best response. We build a strategic extension of the classical learning models accounting for these behavioral phenomena.
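The fictitious play benchmark mentioned above can be sketched as follows (an illustrative best-response learner, not the paper's fitted model; the robot's mixing probabilities are hypothetical):

```python
import numpy as np

# Actions: 0 = Rock, 1 = Paper, 2 = Scissors; BEATS[a] is the action that beats a.
BEATS = {0: 1, 1: 2, 2: 0}

class FictitiousPlay:
    """Best-respond to the empirical frequency of the opponent's past actions."""

    def __init__(self):
        self.counts = np.ones(3)  # Laplace prior over the opponent's actions

    def act(self):
        predicted = int(np.argmax(self.counts))  # opponent's modal action so far
        return BEATS[predicted]

    def observe(self, opponent_action):
        self.counts[opponent_action] += 1

# Against a robot playing a noisy stationary strategy (mostly Rock),
# fictitious play converges to the best response (Paper).
rng = np.random.default_rng(0)
fp = FictitiousPlay()
for _ in range(200):
    robot = rng.choice(3, p=[0.8, 0.1, 0.1])  # hypothetical robot strategy
    fp.observe(robot)
assert fp.act() == 1  # Paper beats the robot's modal action (Rock)
```

Raising the noise in the robot's mixture (e.g. `p=[0.4, 0.3, 0.3]`) flattens the empirical counts and slows recognition, which is the comparative static the treatments vary.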
We consider standard monopolistic competition models in the spirit of Dixit and Stiglitz or Melitz with aggregate consumer's preferences defined by two well-known classes of utility functions – the implicitly defined Kimball utility function and the variable elasticity of substitution utility function. These two classes generalize the classical constant elasticity of substitution (CES) utility function and overcome its lack of flexibility. It is shown in [Dhingra, Morrow, 2012] that, for the monopolistic competition model with aggregate consumer's preferences defined by the variable elasticity of substitution utility function, the laissez-faire equilibrium is efficient (i.e. coincides with the social welfare optimum) only in the special case of the CES utility function. We prove that the CES utility function is also the only one which leads to an efficient laissez-faire equilibrium in the monopolistic competition model with aggregate consumer's preferences defined by a utility function from the Kimball class. Our main result is the following: we find that in both cases a special tax on firms' output may be introduced such that the market equilibrium becomes socially efficient. In both cases this tax is determined up to an arbitrary constant, and some considerations about the «most reasonable» value of this constant are presented.
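For reference, the CES benchmark against which both classes are compared is the standard symmetric specification (our notation, not necessarily the paper's exact formulation):

```latex
U \;=\; \left( \int_{0}^{N} q(i)^{\rho}\, di \right)^{1/\rho},
\qquad \sigma \;=\; \frac{1}{1-\rho} \;>\; 1 .
```

Under CES every firm charges the same markup $\sigma/(\sigma-1)$ over marginal cost, so relative prices equal relative marginal costs and the laissez-faire allocation coincides with the planner's. Variable-elasticity and Kimball preferences make markups vary across firms, breaking this coincidence; that distortion is what the corrective output tax is designed to undo.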
We explore the noticeable 2007–2008 liberalization of Russian labor immigration policy, treated as a natural experiment: how did it influence the labor market equilibrium and, especially, the wages of certain categories of Russian employees? We use various data, including remittances from Russia, to reconstruct the related hike in official and unofficial labor immigration. According to our rough estimate, the mass of gastarbeiters increased significantly, from 3.4 mln in 2006 to 4.3 mln in 2007 and then to 4.9 mln in 2008. This did not cause additional unemployment, but it did influence wages. We follow the Borjas (2016) method of assessing the impact of natural experiments, focusing on (equilibrium) wage elasticities and interdependencies among labor groups in Russia. To reveal the elasticity of equilibrium wages in response to the 2007–2008 inflow of (mostly unskilled) labor, we run difference-in-differences regressions on RLMS data. For some Russian residents, wages responded noticeably to the new policy. The most affected were the pre-established Asian migrants: they lost about 14–17.5% of their wages in response to an 8–14% increase in the similar labor force. Ethnic Russians in blue-collar or low-qualification jobs lost about 4.5–5.5% of their wages, while the impact on white-collar workers was insignificant. In discussing the macroeconomic consequences of such liberalization policies for Russia, we thereby point out the negatively affected categories of employees and the degree of their losses, which can be compared with the additional GDP generated.
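The difference-in-differences design can be illustrated on simulated data (all magnitudes below are invented for illustration; they are not the RLMS estimates):

```python
import numpy as np

# Simulated two-period, two-group wage panel (hypothetical numbers).
rng = np.random.default_rng(42)
n = 4000
treated = rng.integers(0, 2, n)   # 1 = wage group exposed to the migrant inflow
post = rng.integers(0, 2, n)      # 1 = observed after the 2007-2008 liberalization
true_effect = -0.15               # assumed treatment effect on log wages
log_wage = (2.0 + 0.1 * treated + 0.05 * post
            + true_effect * treated * post
            + rng.normal(0, 0.2, n))

# OLS with the treated x post interaction: its coefficient is the DiD estimate.
X = np.column_stack([np.ones(n), treated, post, treated * post])
beta, *_ = np.linalg.lstsq(X, log_wage, rcond=None)
did_estimate = beta[3]
assert abs(did_estimate - true_effect) < 0.05
```

The identifying assumption is the usual parallel-trends condition: absent the policy, treated and control wage groups would have followed the same time path.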
A general description, in the form of an iterative procedure, is given for methods implementing the Single Transferable Vote (STV). Woodall's axiomatics for ordinal proportional representation systems is examined. New axioms for STV are constructed via a modification of the quota; the new definition of the quota improves the theoretical properties of the procedure. A new method is proposed based on STV and the new definition of the quota, and a theorem is proved that this method is the only one satisfying these axioms. The method is called the weighted inclusive Gregory method with modified quota and random equiprobable choice of the winning coalition at each iteration. The results are extended to methods that transfer a fractional number of votes.
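An iterative STV count of the kind described above can be sketched as follows (a simplified weighted inclusive Gregory count with the standard Droop quota; tie-breaking, exhausted-ballot handling, and the paper's modified quota are not implemented):

```python
from collections import Counter
from fractions import Fraction

def stv(ballots, seats):
    """Simplified STV: ballots are tuples of candidate names in preference order."""
    weights = [Fraction(1)] * len(ballots)
    quota = Fraction(len(ballots), seats + 1) + 1   # Droop quota
    elected, excluded = [], set()

    def current_pref(ballot, skip_last_elected=False):
        done = set(elected[:-1] if skip_last_elected else elected) | excluded
        return next((c for c in ballot if c not in done), None)

    while len(elected) < seats:
        tally = Counter()
        for b, w in zip(ballots, weights):
            c = current_pref(b)
            if c is not None:
                tally[c] += w
        if len(tally) + len(elected) <= seats:   # remaining hopefuls all win
            elected.extend(sorted(tally))
            break
        winner = max(tally, key=tally.get)
        if tally[winner] >= quota:
            elected.append(winner)
            # Weighted inclusive Gregory: scale EVERY ballot currently backing
            # the winner by the surplus fraction, preserving its prior weight.
            factor = (tally[winner] - quota) / tally[winner]
            for i, b in enumerate(ballots):
                if current_pref(b, skip_last_elected=True) == winner:
                    weights[i] *= factor
        else:
            excluded.add(min(tally, key=tally.get))  # eliminate the weakest
    return elected

# 12 ballots, 2 seats, quota = 5: A is elected at quota, B is eliminated,
# and C wins the final seat on transfers.
ballots = [('A', 'B', 'C')] * 5 + [('C', 'B', 'A')] * 4 + [('B', 'A', 'C')] * 3
assert stv(ballots, seats=2) == ['A', 'C']
```

Using `Fraction` keeps the transferred weights exact, which matters for fractional-transfer variants where rounding can change the winner.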
In this paper we test hypotheses about individual decision making under risk using behavioral data on participants of the unique Russian TV show «Sdelka?!». The show presents a game in which participants choose between a guaranteed amount of money and a lottery that may result in gains or losses. Participants are assumed to make decisions based on prospect theory and cumulative prospect theory, including both subjective probability transformation and reference-dependent behavior. We further assume that the reference point is dynamic, so it may change over the course of the game. In order to estimate the parameters of the participants' decision-making mechanism, we propose an econometric binary choice model based on the quasi maximum likelihood method. The results suggest that contestants adapt the reference point depending on the course of the game. Adaptation appears to be asymmetric: the reference point shifts noticeably to the right in response to gains and substantially less to the left if the game goes poorly. In addition, we find weak evidence in favor of a loss aversion effect. To demonstrate the robustness of the results, we use various approaches to subjective probability transformation. According to the Akaike information criterion, econometric models incorporating probability transformation are superior to the objective probability model.
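The probability-transformation ingredient of such a model can be sketched on simulated data (the Prelec weighting function and the logit choice rule below are illustrative assumptions, not the paper's exact specification):

```python
import numpy as np

def prelec(p, alpha):
    # Prelec (1998) one-parameter weighting function: w(p) = exp(-(-ln p)^alpha).
    # alpha < 1 overweights small and underweights large probabilities.
    return np.exp(-(-np.log(p)) ** alpha)

# Simulate choices between a sure amount and a (win `prize` / win 0) lottery.
rng = np.random.default_rng(1)
n = 5000
p_win = rng.uniform(0.05, 0.95, n)          # objective win probability
prize, sure = 100.0, rng.uniform(10, 90, n)

true_alpha, scale = 0.6, 0.1
ev_gap = prelec(p_win, true_alpha) * prize - sure    # weighted lottery value - sure amount
p_lottery = 1 / (1 + np.exp(-scale * ev_gap))        # logit choice rule
choice = (rng.uniform(size=n) < p_lottery).astype(int)

# Quasi maximum likelihood over alpha (simple grid search for transparency).
def loglik(alpha):
    gap = prelec(p_win, alpha) * prize - sure
    q = 1 / (1 + np.exp(-scale * gap))
    return np.sum(choice * np.log(q) + (1 - choice) * np.log(1 - q))

grid = np.linspace(0.3, 1.2, 91)
alpha_hat = grid[np.argmax([loglik(a) for a in grid])]
assert abs(alpha_hat - true_alpha) < 0.1
```

Setting `alpha = 1` recovers objective probabilities, so a likelihood comparison between the fitted and restricted models is exactly the kind of test the AIC comparison in the paper performs.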
In the 2000s the Russian government considered the e-auction the best way to procure goods for public needs. In this paper we confirm this proposition using an empirical dataset on 3 thousand contracts for the procurement of granulated sugar in Russia in 2011. Our data show that unit prices are higher for long-term contracts. This result can be explained by the rigidity of public procurement regulation: Russian legislation allows only fixed-price contracts. Under these conditions, suppliers may be ready to participate in public procurement tenders for long-term contracts only if the price includes a "risk premium" covering the supplier's additional expenses in case of an unfavorable turn in the market. Our analysis shows that sugar prices in Russian public procurement are lower for contracts with higher volume. These results are in line with the conclusions of previous studies of public procurement in other countries. The influence of competition, measured by the number of suppliers participating in a procurement procedure, has a quadratic form: the effect of an additional participant is smaller when the number of competitors is higher, and vice versa. Our analysis also shows essential distinctions in the influence of the same factors on contract prices between competitive procedures and void auctions. This result is important for economic policy, but additional investigation is needed here.
The aim of this paper is to reveal the main features of the bank lending channel in the Russian economy. The answer to this question is important for increasing the efficiency of monetary policy, because it allows evaluating the extent to which monetary policy impulses affect bank loans, which are one of the main sources of investment in Russia. The methodology is based on [Kashyap, Stein, 2000]. We analyze monthly data on individual Russian banks' balance sheets over the period 2010–2014. In order to take into account the considerable differences between groups of Russian banks, we divide our sample into groups according to ownership structure, value of net assets, and main activities. We also address heterogeneity within the period of time we study. It is shown that there exists a relationship between the liquidity level of Russian banks' balance sheets, their lending policy, and monetary policy impulses, depending on the banks' characteristics and the period of time. We find a liquidity "anti-effect" for the big Russian banks during the structural liquidity deficit period: the more liquid their balance sheets, the more these banks substitute liquid asset purchases for corporate loans under a contractionary monetary policy regime. For some groups of medium and small banks that prefer to maintain the volume of corporate loans by selling liquid assets, we confirm the "classical" Kashyap and Stein liquidity effect. We also find some groups of medium and small banks for which the bank lending channel did not work during the period we analyze.
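The Kashyap-Stein-style identification can be illustrated with a stylized simulated panel (the interaction-term regression below sketches the idea, not the paper's specification; all coefficients are hypothetical):

```python
import numpy as np

# The bank lending channel predicts that loan growth of LESS liquid banks reacts
# more strongly to monetary tightening, i.e. a positive coefficient on the
# liquidity x policy-rate-change interaction.
rng = np.random.default_rng(7)
n_banks, n_months = 200, 60
liq = rng.uniform(0.05, 0.5, n_banks)       # liquid assets / total assets, per bank
d_rate = rng.normal(0, 0.5, n_months)       # monthly policy rate changes

L = np.repeat(liq, n_months)                # stack the panel bank-by-bank
R = np.tile(d_rate, n_banks)
true_beta = 0.8                             # assumed interaction effect
d_loans = -0.3 * R + true_beta * L * R + rng.normal(0, 0.2, n_banks * n_months)

X = np.column_stack([np.ones_like(L), L, R, L * R])
beta_hat = np.linalg.lstsq(X, d_loans, rcond=None)[0][3]
assert abs(beta_hat - true_beta) < 0.1
```

The "anti-effect" found for big banks corresponds to this interaction coefficient flipping sign in their subsample: more liquid big banks cut corporate lending by more, not less, under tightening.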