Combining the probability distributions elicited from experts during a due diligence is valuable for encapsulating the accumulated information for decision makers and for conveying the current state of expert opinion on important uncertainties. This paper therefore shows how to construct and combine experts' probability distributions, which can then be used in a Monte Carlo simulation to calculate company values.
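As a rough illustration of this idea (not the paper's actual model), the sketch below pools hypothetical triangular distributions from three experts with a linear opinion pool and feeds the pooled samples into a toy discounted-cash-flow Monte Carlo. All numbers (growth ranges, margin, discount rate, horizon) are made-up placeholders.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 20_000  # Monte Carlo draws

# Hypothetical expert assessments of next-year revenue growth
# (triangular: min, mode, max) and equal weights for a linear opinion pool.
experts = [(-0.02, 0.04, 0.10), (0.00, 0.05, 0.12), (-0.05, 0.03, 0.08)]
weights = np.full(len(experts), 1.0 / len(experts))

# Linear opinion pool: pick an expert per draw according to the weights,
# then sample from that expert's distribution.
choice = rng.choice(len(experts), size=N, p=weights)
growth = np.array([rng.triangular(*experts[k]) for k in choice])

# Toy discounted-cash-flow valuation driven by the pooled growth samples
# (illustrative parameters, not from the paper).
revenue0, margin, horizon, wacc = 100.0, 0.15, 5, 0.09
values = np.zeros(N)
for t in range(1, horizon + 1):
    cash_flow = revenue0 * (1.0 + growth) ** t * margin
    values += cash_flow / (1.0 + wacc) ** t

print(f"median value: {np.median(values):.1f}")
print(f"5%-95% interval: {np.percentile(values, [5, 95])}")
```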
Human Immunodeficiency Virus 1 (HIV-1) evades adaptive immunity by means of its extremely high mutation rate, which allows the HIV envelope glycoprotein to continuously escape the action of antibodies. However, some broadly neutralizing antibodies (bNAbs) targeting specific viral regions can block the infectivity of a large number of viral variants. The discovery of these antibodies opens new avenues in anti-HIV therapy; they remain suboptimal tools, however, as their breadth of action covers only 50% to 90% of viral variants. In this context, being able to discriminate between strains that are sensitive and strains that are resistant to an antibody would be of great interest for the design of optimal clinical antibody treatments and for engineering potent bNAbs for clinical use. Here, we describe a hierarchical procedure to predict the neutralization efficacy of three well-known anti-CD4bs bNAbs (VRC01, NIH45-46 and 3BNC117) against multiple viral isolates. Our method consists of simulating the three-dimensional binding process between gp120 and the antibody using Protein Energy Landscape Exploration (PELE), a Monte Carlo stochastic approach. Our results clearly indicate that the binding profiles of strains sensitive and resistant to a bNAb behave differently, with resistant strains showing weaker binding, which can be exploited to predict antibody neutralization efficacy in hypermutated HIV-1 strains. This research was funded by a predoctoral fellowship from the Government of Catalonia (2020FI_B2_00138 to P.A.-R.) and by the Spanish Government, grant PID2019-106370RB-I00. J.B.'s laboratory is funded by the Spanish Institute of Health Carlos III (ISCIII, projects PI17/01518 and PI20/00093) and by the MAC Foundation (project E-C-YY-24414).
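PELE itself is a full molecular simulation package and is not reproduced here; the toy sketch below only illustrates the downstream idea of turning a simulated binding profile into a sensitive/resistant call, assuming per-pose antibody-antigen distances and interaction energies have already been exported from the simulations. The cutoff, threshold, and data are hypothetical, not values from the study.

```python
import numpy as np

def binding_score(distances, energies, contact_cutoff=8.0):
    """Summarize a simulated binding profile as the mean interaction energy
    of poses where the antibody sits close to the epitope.
    Units and cutoff are illustrative placeholders."""
    close = distances < contact_cutoff
    if not np.any(close):
        return 0.0  # no close poses at all -> effectively no binding funnel
    return float(np.mean(energies[close]))

def predict_sensitivity(distances, energies, threshold=-40.0):
    """Label a strain 'sensitive' if its close-range poses reach, on average,
    energies below a hypothetical calibrated threshold."""
    score = binding_score(distances, energies)
    return "sensitive" if score < threshold else "resistant"

# Toy data standing in for exported trajectories of two strains.
rng = np.random.default_rng(0)
sens_d, sens_e = rng.uniform(2, 20, 500), rng.normal(-55, 10, 500)
res_d, res_e = rng.uniform(2, 20, 500), rng.normal(-20, 10, 500)

print(predict_sensitivity(sens_d, sens_e))  # expected: sensitive
print(predict_sensitivity(res_d, res_e))    # expected: resistant
```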
Increased awareness of sustainability issues is contributing to the evolution of energy and environmental policies for the building sector at the EU level, oriented toward resource efficiency. Several strategies exist today for modelling building performance across the life cycle, and growing computational capacity and data acquisition capability are opening new scenarios for practical applications that can help close the gap commonly observed between simulated and measured energy performance. This article investigates an approach to probabilistic building performance simulation to be used across life cycle phases, employing reduced-order models for performance monitoring and energy management. The proposed workflow aims to establish continuity between the design and operation phases. Design-phase simulation is generally subject to significant time and cost constraints; a successful workflow should therefore incorporate elements of current design practice while adding new features that are sufficiently automated to limit the additional effort. Accordingly, the proposed workflow is automated and tested for robustness using Monte Carlo techniques. In the design phase, the approach can be used to identify probabilistic performance bounds suitable for risk analysis of energy efficiency investments, employing cost-optimal or life cycle cost accounting methodologies. In the operation phase, it can be used for performance monitoring and energy management based on daily energy consumption analysis, similarly to other state-of-the-art multivariate regression-based methods, addressing the need to keep energy consumption and related costs constantly under control.
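A minimal sketch of the probabilistic, reduced-order idea, assuming a simple steady-state heat-balance surrogate; the parameter ranges, building dimensions, and degree-hours below are hypothetical placeholders rather than values from the article.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 20_000  # Monte Carlo samples

# Uncertain inputs of a reduced-order (steady-state) model; the ranges
# are hypothetical placeholders, not values from the article.
u_env   = rng.uniform(0.8, 1.2, N)     # envelope U-value, W/(m2 K)
ach     = rng.uniform(0.3, 0.9, N)     # air changes per hour
eta_sys = rng.uniform(0.80, 0.95, N)   # heating system efficiency
area, volume, hdh = 1500.0, 4500.0, 60_000.0  # m2, m3, heating degree-hours (K h)

# Reduced-order surrogate: transmission + ventilation losses over the
# heating season, divided by system efficiency -> annual heating energy (kWh).
q_trans = u_env * area * hdh / 1000.0
q_vent  = 0.34 * ach * volume * hdh / 1000.0
energy  = (q_trans + q_vent) / eta_sys

# Probabilistic performance bounds usable for risk analysis.
p5, p50, p95 = np.percentile(energy, [5, 50, 95])
print(f"annual heating energy [kWh]: P5={p5:.0f}, median={p50:.0f}, P95={p95:.0f}")
```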
"In conventional stochastic simulation algorithms, Monte Carlo integration and curve fitting are merged together and implemented by means of regression. We perform a decomposition of the solution error and show that regression does a good job in curve fitting but a poor job in integration, which leads to low accuracy of solutions. We propose a generalized notion of stochastic simulation approach in which integration and curve fitting are separated. We specifically allow for the use of deterministic (quadrature and monomial) integration methods which are more accurate than the conventional Monte Carlo method. We achieve accuracy of solutions that is orders of magnitude higher than that of the conventional stochastic simulation algorithms"--National Bureau of Economic Research web site
The input required for a seismic hazard study using conventional Probabilistic Seismic Hazard Assessment (PSHA) methods can also be used for probabilistic analysis of hazard using Monte Carlo simulation methods. This technique is very flexible, and seems to be under-represented in the literature. It is very easy to modify the form of the seismicity model used, for example to introduce non-Poissonian behaviour, without extensive reprogramming. Uncertainty in input parameters can also be modelled very flexibly, for example by the use of a standard deviation rather than by the discrete branches of a logic tree. In addition (and this advantage is perhaps not as trivial as it may sound), the simplicity of the method means that its principles can be grasped by the layman, which is useful when results have to be explained to people outside the seismological/engineering communities, such as planners and politicians. In this paper, some examples of the Monte Carlo method in action are shown in the context of a low to moderate seismicity area: the United Kingdom.
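A minimal sketch of the catalogue-simulation idea, assuming a single area source with Gutenberg-Richter magnitudes and a deliberately crude, made-up attenuation relation; none of the numbers correspond to the UK examples in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical area-source parameters (illustrative only).
years, n_catalogs = 50, 2_000                # catalogue length and count
rate, b, m_min, m_max = 0.5, 1.0, 4.0, 6.5   # events/yr >= m_min, G-R b-value
site_dist_km = (5.0, 80.0)                   # uniform source-to-site distance range

beta = b * np.log(10.0)

def sample_magnitudes(n):
    """Truncated exponential (Gutenberg-Richter) magnitudes via inverse CDF."""
    u = rng.random(n)
    c = 1.0 - np.exp(-beta * (m_max - m_min))
    return m_min - np.log(1.0 - u * c) / beta

def pga_g(m, r_km):
    """Very simple illustrative attenuation: ln PGA = a + b*M - c*ln R + noise."""
    ln_pga = -3.5 + 0.9 * m - 1.1 * np.log(r_km) + rng.normal(0.0, 0.5, m.size)
    return np.exp(ln_pga)

threshold = 0.05  # g
exceed = 0
for _ in range(n_catalogs):
    n_events = rng.poisson(rate * years)     # Poissonian occurrence; easy to swap out
    if n_events == 0:
        continue
    m = sample_magnitudes(n_events)
    r = rng.uniform(*site_dist_km, n_events)
    if np.any(pga_g(m, r) >= threshold):
        exceed += 1

print(f"P(PGA >= {threshold} g in {years} yr) ~ {exceed / n_catalogs:.3f}")
```

The flexibility mentioned in the abstract shows up directly here: replacing the Poisson draw with a clustered or time-dependent occurrence model, or sampling the attenuation coefficients from their own distributions, requires changing only a few lines rather than restructuring a logic tree.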