Channel: MoneyScience: All site news items

The JCMS Annual Review of the European Union in 2016


Produced in association with the Journal of Common Market Studies (JCMS), the Annual Review covers the major developments in the European Union in the past year.



Agile Project Management For Dummies, 2nd Edition


Agile project management is a fast and flexible approach to managing all projects, not just software development. By learning the principles and techniques in this book, you'll be able to create a product roadmap, schedule projects, and prepare for product launches with the ease of Agile software developers. You'll discover how to manage scope, time, and cost, as well as team dynamics, quality, and risk of every project.


Out-there Scenarios I: ISIS is Funded by a Major Asset Management Firm


I break risk management into three levels, Versions 1.0, 2.0, and 3.0.

Risk Management 1.0 is the standard risk management of VaR and the like, where history is used as a guide, and thus where things work if the future is drawn from the same distribution as the past. Any approach that looks at risks historically, whether using past prices, variance-covariance relationships, leverage numbers, or credit ratings; whether using a normal distribution, a t distribution, a gamma distribution, or a part of a distribution like semi-variance, is part of this. If the future looks like the past in some specific ways, it works; if the future deviates from the past, it might not.
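A minimal sketch of the Version 1.0 approach, using a synthetic return history (the numbers and the normal draw below are illustrative assumptions, not market data): one-day historical VaR is just a quantile of past returns, so it is only as good as the assumption that the future is drawn from the same distribution.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical return history; Version 1.0 treats whatever happened as the
# full description of what can happen.
returns = rng.normal(loc=0.0005, scale=0.01, size=1000)

def historical_var(returns, level=0.99):
    """One-day historical VaR: the loss threshold exceeded with
    probability (1 - level), read straight off the empirical quantile."""
    return -np.quantile(returns, 1.0 - level)

var_99 = historical_var(returns)  # roughly 2.3 daily standard deviations here
```

Everything here is backward-looking: if tomorrow is not drawn from `returns`, the number says nothing.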

Risk Management 2.0 is a reaction to the fact that the 1.0 methods failed during the 2008 crisis. This failure was not surprising or unexpected to most of those working in risk management, because we understand the assumptions behind Version 1.0. But sometimes this was not articulated well when the numbers were passed up the chain. In any case, after 2008 risk management started to depend more visibly on stress testing. I say "more visibly" because anyone doing risk management over the past decades has done stress testing in one form or another. Certainly when there are non-linear risk-return tradeoffs, as with option exposures, it is a standard method. But after 2008 it became de rigueur in the analysis of bank risk, for example through CCAR.
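A static stress test of the Version 2.0 kind can be sketched in a few lines. The positions and shock sizes below are invented for illustration, not taken from any regulatory template:

```python
# Hypothetical dollar exposures and stress scenarios (illustrative only).
positions = {"equities": 1_000_000, "credit": 500_000, "rates": 750_000}

scenarios = {
    "2008-style": {"equities": -0.40, "credit": -0.15, "rates": 0.05},
    "rate-shock": {"equities": -0.10, "credit": -0.05, "rates": -0.12},
}

def stressed_pnl(positions, shocks):
    """Static stress: apply each shock to the matching exposure and sum.
    Note what is missing: no feedback, no contagion, no second round."""
    return sum(exposure * shocks.get(name, 0.0)
               for name, exposure in positions.items())

losses = {name: stressed_pnl(positions, shocks)
          for name, shocks in scenarios.items()}
```

The comment in `stressed_pnl` is the point: a static stress is a single multiplication, which is exactly why it misses the dynamics that Version 3.0 is concerned with.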

And there is Risk Management 3.0, which I won't get into here. It recognizes that a static stress will miss important dynamics that lead to feedback, contagion, and cascades, and it is not something that can be readily addressed with standard economics. You can check out my book, The End of Theory, or some of my papers from my time at the Office of Financial Research to get more on this.

Here I am focused on what we need to do before we can get to these dynamics: we need to know what is triggering a market dislocation. And we are particularly interested in triggers that are large either in magnitude or in the number of agents affected. So even before worrying about the methods for dealing with crisis dynamics, the question to ask is: What can go wrong in a really big way?

I sometimes get at this by starting with something really extreme, and then dialing it back until it can be considered a reasonable scenario. Reasonable does not mean it is likely to happen, but neither is it "what if an asteroid hits New York." Anyway, I want to run through some of the extreme scenarios that I have been thinking about. I'll put one out here and see if anyone responds, either with comments on it in particular, or with others that they are cooking up in a similar vein.

So, Out-there Scenario I: A large asset manager is rumored to be funding ISIS.

Suppose a rumor goes viral that a very large asset management firm is actually owned by, or at least is funding, ISIS. This hits all the usual fake news outlets, and then, of course, bounces into the real news, if only as "there is a rumor, unsubstantiated, making the rounds that...." The result will be large-scale redemptions at that asset manager. This will start a downdraft in the markets. It will also lead to questions about other asset managers, and redemptions there as well. The resulting cascade could spread across the markets, erode confidence, and become a major market event.

Now, of course (at least I hope it is obvious), I am not saying that this specific rumor is likely. But start with this and, as I suggested above, dial it down a bit. The point is that we can come up with scenarios that produce massive redemptions at some particular major asset manager, that are exogenous to anything in the market, and that on their face might seem unreasonable.

One argument against this path to major redemptions hitting the market is that people can redeem by moving their holdings to another asset manager. If they do that, there will be no actual selling of assets, and no market impact. This is the way investors will redeem if they continue to want to hold the assets and if they operate with professional aplomb. But the sort of people who would buy into a rumor like this are also likely to simply say, "give me my money," and figure out what to do after that.
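The feedback loop in this scenario — redemptions force sales, sales move prices, falling prices trigger further redemptions — can be sketched as a toy iteration. All parameters are invented for illustration; nothing here is calibrated to any real fund or market:

```python
def redemption_cascade(aum, first_redemption, impact=1e-12,
                       sensitivity=2.0, rounds=10):
    """Toy feedback loop: each round's redemptions are met by selling
    (not by in-kind transfers), the sales move the price through a linear
    impact, and the price decline triggers proportional new redemptions."""
    price, redemption, total = 1.0, first_redemption, 0.0
    for _ in range(rounds):
        sold = redemption                      # assume "give me my money"
        drop = impact * sold                   # linear price impact of sales
        price *= (1.0 - drop)
        total += redemption
        aum -= redemption
        redemption = sensitivity * drop * aum  # next round of redemptions
    return price, total

# A $100bn manager hit by a 10% first-round redemption (illustrative numbers).
final_price, total_redeemed = redemption_cascade(1e11, 1e10)
```

With these parameters each round's redemptions are smaller than the last and the loop damps out; push `sensitivity` or `impact` high enough that each round exceeds the previous one and the iteration diverges instead, which is the crisis dynamic a static stress cannot see.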

A little footnote: a few years ago the Office of Financial Research did a research study of the asset management community, with the key question being whether the largest asset managers should be designated SIFIs (systemically important financial institutions). The report was castigated, especially by the SEC, mostly, I think, because the SEC was primed for inter-agency rivalry. But in any case, no one threw the ISIS scenario into the report.


The Strength of Absent Ties: Social Integration via Online Dating. (arXiv:1709.10478v1 [physics.soc-ph])


We used to marry people to whom we were somehow connected: friends of friends, schoolmates, neighbours. Since we were more connected to people similar to us, we were likely to marry someone from our own race.


Distributions of Centrality on Networks. (arXiv:1709.10402v1 [cs.SI])


In many social and economic networks, agents' outcomes depend substantially on the centrality of their network position. Our current understanding of network centrality is largely restricted to deterministic settings, but in many applications data limitations or theoretical concerns lead practitioners to use random network models. We provide a foundation for understanding how central agents in random networks are likely to be. Our main theorems show that on large random networks, centrality measures are close to their expected values with high probability. By applying these theorems to stochastic block models, we study how segregated networks contribute to inequality. When networks are segregated, benefits from peer effects tend to accrue unevenly to the advantage of more central individuals and groups. We also discuss applications to more general network formation models, including models where link probabilities are governed by geography.
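A small sketch of the kind of object this paper studies, using only numpy (block sizes and link probabilities are illustrative assumptions): eigenvector centrality on a two-block stochastic block model, where the more densely wired block ends up systematically more central.

```python
import numpy as np

rng = np.random.default_rng(1)

def sbm_adjacency(sizes, p):
    """Adjacency matrix of a stochastic block model: nodes i, j are linked
    with probability p[block(i), block(j)]; symmetric, no self-loops."""
    n = sum(sizes)
    block = np.repeat(np.arange(len(sizes)), sizes)
    probs = p[block[:, None], block[None, :]]
    upper = np.triu(rng.random((n, n)) < probs, 1)
    return (upper | upper.T).astype(float)

def eigenvector_centrality(a):
    """Leading eigenvector of the adjacency matrix, scaled to sum to 1."""
    _, vecs = np.linalg.eigh(a)
    v = np.abs(vecs[:, -1])
    return v / v.sum()

# Block 0 is wired more densely than block 1 (a stylized "advantaged" group).
p = np.array([[0.10, 0.02],
              [0.02, 0.05]])
c = eigenvector_centrality(sbm_adjacency([200, 200], p))
```

At this size the per-block average centralities already vary little across random draws — the paper's concentration result in miniature — and the denser block's advantage is visible in every realization.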

Obstacle problems for nonlocal operators. (arXiv:1709.10384v1 [math.AP])


We prove existence, uniqueness, and regularity of viscosity solutions to the stationary and evolution obstacle problems defined by a class of nonlocal operators that are not stable-like and may have supercritical drift. We give sufficient conditions on the coefficients of the operator to obtain Hölder and Lipschitz continuous solutions. The class of nonlocal operators that we consider includes non-Gaussian asset price models widely used in mathematical finance, such as Variance Gamma Processes and Regular Lévy Processes of Exponential type. In this context, the viscosity solutions that we analyze coincide with the prices of perpetual and finite expiry American options.

Classification of the Bounds on the Probability of Ruin for Lévy Processes with Light-tailed Jumps. (arXiv:1709.10295v1 [math.PR])


In this note, we study the ultimate ruin probabilities of a real-valued Lévy process X with light-tailed negative jumps. It is well-known that, for such Lévy processes, the probability of ruin decreases as an exponential function with a rate given by the root of the Laplace exponent, when the initial value goes to infinity. Under the additional assumption that X has integrable positive jumps, we show how a finer analysis of the Laplace exponent gives in fact a complete description of the bounds on the probability of ruin for this class of Lévy processes. This leads to the identification of a case that is not considered in the literature and for which we give an example. We then apply the result to various risk models, and in particular the Cramér-Lundberg model perturbed by Brownian motion.
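A Monte Carlo sketch of the last model mentioned — the Cramér-Lundberg surplus process perturbed by Brownian motion, with exponential claim sizes. All parameter values are illustrative assumptions, and the finite-horizon estimate only approximates the ultimate ruin probability the note studies:

```python
import numpy as np

rng = np.random.default_rng(2)

def ruin_probability(u, c=1.5, lam=1.0, mu=1.0, sigma=0.2,
                     horizon=50.0, dt=0.02, n_paths=2000):
    """Finite-horizon Monte Carlo estimate of the ruin probability for
        U_t = u + c t - (compound Poisson claims) + sigma W_t,
    with Poisson claim rate lam and exponential claim sizes of mean mu.
    The ultimate ruin probability decays like exp(-R u), with R the
    positive root of the Laplace exponent."""
    steps = int(horizon / dt)
    x = np.full(n_paths, float(u))
    alive = np.ones(n_paths, dtype=bool)
    for _ in range(steps):
        # Premium income plus Brownian perturbation (frozen on ruined paths).
        x += np.where(alive,
                      c * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths),
                      0.0)
        # A claim arrives on each live path with probability lam * dt.
        claim = alive & (rng.random(n_paths) < lam * dt)
        x -= np.where(claim, rng.exponential(mu, n_paths), 0.0)
        alive &= x >= 0.0
    return 1.0 - alive.mean()

psi_2 = ruin_probability(u=2.0)  # initial capital of two mean claim sizes
```

With this 50% safety loading the adjustment coefficient works out to roughly 0.34, so the estimate should sit below the Lundberg bound exp(-0.34 * 2) ≈ 0.51.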

A Structural Model for Fluctuations in Financial Markets. (arXiv:1709.10277v1 [q-fin.ST])


In this paper we provide a comprehensive analysis of a structural model for the dynamics of prices of assets traded in a market, originally proposed in [1]. The model takes the form of an interacting generalization of the geometric Brownian motion model. It is formally equivalent to a model describing the stochastic dynamics of a system of analogue neurons, which is expected to exhibit glassy properties and thus many meta-stable states in a large portion of its parameter space. We perform a generating functional analysis, introducing a slow driving of the dynamics to mimic the effect of slowly varying macro-economic conditions. Distributions of asset returns over various time separations are evaluated analytically and are found to be fat-tailed in a manner broadly in line with empirical observations. Our model also allows us to identify collective, interaction-mediated properties of pricing distributions: it predicts pricing distributions which are significantly broader than their non-interacting counterparts if interactions between prices in the model contain a ferromagnetic bias. Using simulations, we are able to substantiate one of the main hypotheses underlying the original modelling, viz. that the phenomenon of volatility clustering can be rationalised in terms of an interplay between the dynamics within meta-stable states and the dynamics of occasional transitions between them.
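A toy version of an interacting family of geometric Brownian motions, with a mean-field "ferromagnetic" pull toward the average of the previous period's returns. The coupling form and all parameters are our illustrative assumptions, not the model of [1]; the sketch only shows the qualitative effect of the bias:

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_returns(n_assets=50, n_steps=2000, sigma=0.02, J=0.0):
    """Log-return dynamics with a mean-field interaction: J = 0 recovers
    independent geometric Brownian motions; J > 0 pulls each asset's
    return toward the cross-sectional average of the last returns."""
    prev_mean = 0.0
    rets = np.empty((n_steps, n_assets))
    for t in range(n_steps):
        # Bounded "ferromagnetic" bias toward yesterday's average return.
        pull = J * sigma * np.tanh(prev_mean / sigma)
        rets[t] = pull + sigma * rng.standard_normal(n_assets)
        prev_mean = rets[t].mean()
    return rets

free = simulate_returns(J=0.0)
coupled = simulate_returns(J=0.8)
```

The interaction correlates returns across assets, so the cross-sectional average return is visibly more volatile than in the non-interacting case — a crude analogue of the broader pricing distributions the paper derives.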


Executive stock option exercise with full and partial information on a drift change point. (arXiv:1709.10141v1 [q-fin.MF])


We analyse the valuation and exercise of an American executive call option written on a stock whose drift parameter falls to a lower value at a change point given by an exponential random time, independent of the Brownian motion driving the stock. Two agents, who do not trade the stock, have differing information on the change point, and seek to optimally exercise the option by maximising its discounted payoff under the physical measure. The first agent has full information and observes the change point. The second agent has partial information and filters the change point from price observations. Our setup captures the position of an executive (insider) and an employee (outsider) who receive executive stock options. The latter case yields a model under the observation filtration $\widehat{\mathbb F}$ in which the drift process becomes a diffusion driven by the innovations process, an $\widehat{\mathbb F}$-Brownian motion also driving the stock under $\widehat{\mathbb F}$, and the partial information optimal stopping problem has two spatial dimensions. We analyse and numerically solve both problems to value the option for both agents, and illustrate that the additional information of the insider can result in exercise patterns which exploit the information on the change point.
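The outsider's filtering step can be sketched with a discrete-time Shiryaev-style recursion: update the posterior probability that the drift has already dropped, from returns alone. All parameters are illustrative, and this discretization is only a sketch of the continuous-time filter used in the paper:

```python
import numpy as np

rng = np.random.default_rng(5)

mu0, mu1 = 0.20, -0.20        # drift before / after the change (illustrative)
sigma, lam, dt = 0.10, 0.5, 1.0 / 252

def change_point_posterior(returns, pi0=0.0):
    """Discrete Shiryaev recursion: predict (the change may just have
    happened), then correct with the likelihood ratio of the new return."""
    p = 1.0 - np.exp(-lam * dt)        # per-step prior change probability
    pi, path = pi0, []
    for r in returns:
        pi = pi + (1.0 - pi) * p
        # Normal likelihoods of r under each drift; the common
        # 1/sqrt(2 pi sigma^2 dt) factor cancels in the ratio.
        f0 = np.exp(-(r - mu0 * dt) ** 2 / (2.0 * sigma ** 2 * dt))
        f1 = np.exp(-(r - mu1 * dt) ** 2 / (2.0 * sigma ** 2 * dt))
        pi = pi * f1 / (pi * f1 + (1.0 - pi) * f0)
        path.append(pi)
    return np.array(path)

# Simulate a path whose drift falls at step 250 of 500, then filter it.
n, change = 500, 250
drift = np.where(np.arange(n) < change, mu0, mu1)
rets = drift * dt + sigma * np.sqrt(dt) * rng.standard_normal(n)
posterior = change_point_posterior(rets)
```

The posterior stays low before the change and climbs afterward; the insider, by contrast, observes the change point directly and needs no filter.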

October 2, 2017 - SS&C Announces Automation of ILPA Fee Reporting Template

Pareto Efficient Taxation and Expenditures: Pre- and Re-distribution -- by Joseph E. Stiglitz

This paper shows that there is a presumption that Pareto efficient taxation entails a positive tax on capital. When tax and expenditure policies can affect the market distribution of income, those effects need to be taken into account, reducing the burden imposed on distortionary redistribution. The paper extends the 1976 Atkinson-Stiglitz results to a dynamic, overlapping generations model, correcting a misreading of the result on the desirability of a zero capital tax. That result required separability of consumption from labor and that the only unobservable differences among individuals were in (fixed) labor productivities. In a general equilibrium model, one needs to take into account the effects of policy changes on binding self-selection constraints; and with non-separability, capital taxation depends on the complementarity/substitutability of leisure during work with retirement consumption. The final section considers taxation when there are constraints on the imposition of intergenerational transfers (either political constraints or those derived from unobservability). It constructs a simple two-class model, with capitalists who maximize dynastic welfare and workers who save for retirement, whose productivity can be enhanced by (publicly provided) education. It derives a simple expression for the optimal capital tax, which is positive so long as the social welfare function is sufficiently egalitarian and the productivity of educational expenditures is sufficiently high.

Paralyzed by Panic: Measuring the Effect of School Closures during the 1916 Polio Pandemic on Educational Attainment -- by Keith Meyers, Melissa A. Thomasson

We leverage the 1916 polio pandemic in the United States as a natural experiment to test whether short-term school closures result in reduced educational attainment as an adult. With over 23,000 cases of polio diagnosed in 1916, officials implemented quarantines and closed schools. Since the pandemic occurred during the start of the 1916 school year, children of working age may have elected not to return to school. Using state-level polio morbidity as a proxy for schooling disruptions, we find that children ages 14-17 during the pandemic had less educational attainment in 1940 compared to their slightly older peers.

Busy Directors: Strategic Interaction and Monitoring Synergies -- by Alexander Ljungqvist, Konrad Raff

We derive conditions for when having a "busy" director on the board is harmful to shareholders and when it is beneficial. Our model allows directors to condition their monitoring choices on their co-directors' choices and to experience positive or negative monitoring synergies across firms. Whether busyness benefits or harms shareholders depends on whether directors' effort choices are strategic substitutes or complements and on the sign of the cross-firm synergies. Our empirical analysis exploits plausibly exogenous shocks that make directors busier on one board and examines how this spills over to other boards. Our results suggest that monitoring efforts typically are strategic complements, except when a firm finds itself facing a crisis. Consistent with the model, we find that busy directors increase monitoring at spillover firms when synergies are positive (which we show increases expected firm value) and reduce monitoring at spillover firms when synergies are negative (which we show reduces expected firm value).

The End of Free College in England: Implications for Quality, Enrolments, and Equity -- by Richard Murphy, Judith Scott-Clayton, Gillian Wyness

Despite increasing financial pressures on higher education systems throughout the world, many governments remain resolutely opposed to the introduction of tuition fees, and some countries and states where tuition fees have long been established are now reconsidering free higher education. This paper examines the consequences of charging tuition fees for university quality, enrolments, and equity. To do so, we study the English higher education system, which has, in just two decades, moved from a free college system to one in which tuition fees are among the highest in the world. Our findings suggest that England's shift has resulted in increased funding per head, rising enrolments, and a narrowing of the participation gap between advantaged and disadvantaged students. In contrast to other systems with high tuition fees, the English system is distinct in that its income-contingent loan system keeps university free at the point of entry and provides students with comparatively generous assistance for living expenses. We conclude that tuition fees, at least in the English case, supported the goals of increasing quality, quantity, and equity in higher education.

Optimal Regulation with Exemptions -- by Louis Kaplow

Despite decades of research on mechanism design and on many practical aspects of cost-benefit analysis, one of the most basic and ubiquitous features of regulation as actually implemented throughout the world has received little theoretical attention: exemptions for small firms. These firms may generate a disproportionate share of harm due to their being exempt and because exemption induces additional harmful activity to be channeled their way. This article analyzes optimal regulation with exemptions where firms have different productivities that are unobservable to the regulator, regulated and unregulated output each cause harm although at different levels, and the regulatory regime affects entry as well as the output choices of regulated and unregulated firms. In many settings, optimal schemes involve subtle effects and have counterintuitive features: for example, higher regulatory costs need not favor higher exemptions, and the incentives of firms to drop output to become exempt can be too weak as well as too strong. A final section examines the optimal use of output taxation alongside regulation, which illustrates the contrast with the mechanism design approach that analyzes the optimal use of instruments of a type that are not in widespread use.

Predicting Relative Returns -- by Valentin Haddad, Serhiy Kozak, Shrihari Santosh

Across a variety of asset classes, we show that relative returns are highly predictable in the time series, in and out of sample, much more so than aggregate returns. For Treasuries, slope is more predictable than level. For equities, dominant principal components of anomaly long-short strategies are more predictable than the market. For foreign exchange, a carry portfolio is more predictable than a basket of all currencies against the dollar. We show that the commonly used practice of predicting each individual asset is often equivalent to predicting only their first principal component, the index, which obscures the predictability of relative returns. Our findings highlight that focusing on important dimensions of the cross-section allows one to uncover additional economically relevant and statistically robust patterns of predictability.
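The decomposition the paper relies on can be illustrated on synthetic data: build a return panel driven by a "level" factor that all assets load on equally and a "slope" factor with spread-out loadings, and note that the dominant principal component recovers the level (the index) while the second recovers the relative, long-short dimension. All numbers below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic return panel: a level factor all assets load on equally, a
# slope factor with loadings running from -1 to 1, plus small noise.
n_t, n_assets = 500, 10
level = rng.standard_normal(n_t)
slope = rng.standard_normal(n_t)
returns = (np.outer(level, np.ones(n_assets))
           + 0.5 * np.outer(slope, np.linspace(-1.0, 1.0, n_assets))
           + 0.1 * rng.standard_normal((n_t, n_assets)))

# Principal components of the cross-section: PC1 is essentially the index
# (level), PC2 the long-short (slope) portfolio whose returns the paper
# finds far more predictable than the index itself.
demeaned = returns - returns.mean(axis=0)
_, eigvecs = np.linalg.eigh(demeaned.T @ demeaned / n_t)
pc1 = demeaned @ eigvecs[:, -1]
pc2 = demeaned @ eigvecs[:, -2]
```

Predicting every asset with the same predictor loads almost entirely on `pc1`; any predictability living in `pc2` is invisible to that exercise, which is the paper's point.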

The Employment Effects of Mexican Repatriations: Evidence from the 1930's -- by Jongkwan Lee, Giovanni Peri, Vasil Yasenov

During the period 1929-34 a campaign forcing the repatriation of Mexicans and Mexican Americans was carried out in the U.S. by states and local authorities. The claim of politicians at the time was that repatriations would reduce local unemployment and give jobs to Americans, alleviating the local effects of the Great Depression. This paper uses this episode to examine the consequences of Mexican repatriations for the labor market outcomes of natives. Analyzing 893 cities using full count decennial Census data for the period 1930-40, we find that repatriation of Mexicans was associated with small decreases in native employment and increases in native unemployment. These results are robust to the inclusion of many controls. We then apply an instrumental variable strategy based on the differential size of Mexican communities in 1930, as well as a matching method, to estimate a causal "average treatment effect." Confirming the OLS regressions, the causal estimates do not support the claim that repatriations had any expansionary effects on native employment, but suggest instead that they had no effect on, or possibly depressed, natives' employment and wages.

Directed Search: A Guided Tour -- by Randall Wright, Philipp Kircher, Benoit Julien, Veronica Guerrieri

This essay surveys the literature on directed/competitive search, covering theory and applications in, e.g., labor, housing and monetary economics. These models share features with traditional search theory, yet differ in important ways. They share features with general equilibrium theory, but with explicit frictions. Equilibria are typically efficient, in part because markets price goods plus the time required to get them. The approach is tractable and arguably realistic. Results are presented for finite and large economies. Private information and sorting with heterogeneity are analyzed. Some evidence is discussed. While emphasizing issues and applications, we also provide several hard-to-find technical results.

A Tough Act to Follow: Contrast Effects In Financial Markets -- by Samuel M. Hartzmark, Kelly Shue

A contrast effect occurs when the value of a previously-observed signal inversely biases perception of the next signal. We present the first evidence that contrast effects can distort prices in sophisticated and liquid markets. Investors mistakenly perceive earnings news today as more impressive if yesterday's earnings surprise was bad and less impressive if yesterday's surprise was good. A unique advantage of our financial setting is that we can identify contrast effects as an error in perceptions rather than expectations. Finally, we show that our results cannot be explained by a key alternative explanation involving information transmission from previous earnings announcements.

The Distribution of Environmental Damages -- by Solomon Hsiang, Paulina Oliva, Reed Walker

Most regulations designed to reduce environmental externalities impose costs on individuals and firms. An active body of research has explored how these costs are disproportionately borne by different sectors of the economy and/or by different groups of individuals. However, much less is known about the distributional characteristics of the environmental benefits created by these policies, or conversely, the differences in environmental damages associated with existing environmental externalities. We review this burgeoning literature and develop a simple and general framework for focusing future empirical investigations. We apply our framework to findings related to the economic impact of air pollution, deforestation, and climate, highlighting important areas for future research. A recurring challenge to understanding the distributional effects of environmental damages is distinguishing between cases where (i) populations are exposed to different levels of, or changes in, an environmental good, and (ii) an incremental change in the environment may have very different implications for some populations. In the latter case, it is often difficult to empirically identify the underlying sources of heterogeneity in marginal damages, as damages may stem from non-linear and/or heterogeneous damage functions. Nonetheless, understanding the determinants of heterogeneity in environmental benefits and damages is crucial for welfare analysis and policy design.



