Voting describes a joint decision-making process in which voters choose winners from a set of candidates. Many voting systems entail computationally difficult problems. This work investigates winner and possible-winner determination, as well as the influence an external agent can exert by adding or deleting candidates. Since the studied problems are NP-hard in general, the work applies a multivariate complexity analysis aimed at identifying tractable scenarios or establishing intractability.
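As a minimal illustration of the base problem on which these hardness results build, the Python sketch below computes winners under the Borda rule from complete preference profiles. The hard variants studied in the work arise from partial votes (possible winners) and from candidate addition or deletion, not from this computation itself.

```python
def borda_winners(profile, candidates):
    """profile: list of rankings, each a list of candidates, best first."""
    m = len(candidates)
    score = {c: 0 for c in candidates}
    for ranking in profile:
        for pos, c in enumerate(ranking):
            score[c] += m - 1 - pos  # Borda: m-1 points for first place, and so on
    best = max(score.values())
    return [c for c in candidates if score[c] == best]

# Example: three voters ranking candidates a, b, c.
profile = [["a", "b", "c"], ["b", "a", "c"], ["a", "c", "b"]]
print(borda_winners(profile, ["a", "b", "c"]))  # -> ['a']
```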
Integrative complexity reflects the degree to which the source of a communication perceives several dimensions and points of view relevant to the topic (differentiation) and the degree to which such characteristics are seen as related to each other (integration). During international crises, bilateral decreases in the integrative complexity of communications frequently precede the outbreak of war; a unilateral decrease reliably precedes surprise strategic attacks. In the current study, complexity was scored in the messages of selected leaders from before until approximately a month after the terrorist attacks of 11 September 2001. Even this limited database replicated some of the complexity patterns found previously and revealed some novel characteristics. This was the first application of the method to hostilities other than inter-nation or civil wars.
Diagnosis of mild traumatic brain injury (TBI) has been difficult because, in a large percentage of cases, conventional computed tomography (CT) or magnetic resonance imaging (MRI) scans show no obvious focal brain lesions. One measure that can objectively characterize potential tissue and neural-network damage is Lempel–Ziv complexity (LZC) applied to magnetoencephalography (MEG) signals. LZC is a model-independent estimator of system complexity that counts the number of distinct patterns in a sequence. We hypothesized that, because of the potential network damage, TBI patients would show a reduced level of complexity in the impaired regions. We included 18 healthy controls and 18 military veterans with TBI in the study. Resting-state MEG data were acquired, and LZC was analyzed across the whole brain. Our results indicated reduced complexity in multiple brain areas in TBI patients relative to the healthy controls. In addition, several neuropsychological measures associated with motor responses, visual perception, and memory correlated with LZC, which may account for some of the cognitive deficits in TBI patients.
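For concreteness, here is a minimal Python sketch of the Lempel–Ziv (1976) phrase-counting complexity on a median-binarized signal. Median binarization is one common preprocessing choice for LZC on physiological signals; the study's exact pipeline may differ, and a normalization by n/log2(n) is often applied as well.

```python
import numpy as np

def lz_complexity(s):
    """Number of phrases in the Lempel-Ziv (1976) parsing of string s."""
    phrases, i, n = 0, 0, len(s)
    while i < n:
        k = 1
        # Grow the current phrase while it already occurs earlier
        # (overlap into the current phrase is allowed by the definition).
        while i + k <= n and s[i:i + k] in s[:i + k - 1]:
            k += 1
        phrases += 1
        i += k
    return phrases

def lzc(signal):
    """Binarize a 1-D signal around its median, then count LZ phrases."""
    med = np.median(signal)
    return lz_complexity("".join("1" if v > med else "0" for v in signal))

# Classic example: 0001101001000101 parses as 0|001|10|100|1000|101 -> 6
print(lz_complexity("0001101001000101"))  # 6
```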
The coexistence of multiple air-interface variants in the upcoming fifth-generation (5G) wireless technology remains a matter of ongoing discussion. This paper focuses on the physical layer of the 5G air interface and provides a harmonization solution for the joint implementation of several multicarrier waveform candidates. Waveforms based either on cyclic prefix orthogonal frequency-division multiplexing (CP-OFDM) or on filter-bank multicarrier (FBMC) are first presented through a harmonized system model. Complexity comparisons among five different waveforms are provided. Then, the complexity of a proposed configurable hardware implementation setup for waveform transmission and reception is evaluated. As a result, the harmonized transmitter and receiver exhibit 25–40% and 15–25% less complexity in floating-point operations, respectively, in comparison to two standalone implementations of the most complex waveform instances of the CP-OFDM and FBMC families. This highlights the similarities between the two families and illustrates the component-reuse advantages of the proposed harmonized solution. This work was performed in the framework of the H2020 Project METIS-II with reference 671680, which is partly funded by the European Union. The authors would like to acknowledge the contributions of their colleagues in METIS-II. This work was also supported in part by the Ministerio de Economia y Competitividad, under Grant TEC2014-60258-C2-1-R.
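The paper's detailed floating-point model is not reproduced in the abstract; the Python sketch below is only a back-of-the-envelope comparison using textbook operation counts. The 5N·log2(N) estimate for a radix-2 (I)FFT, the overlap factor K = 4, and the 4-operations-per-tap polyphase cost are all assumptions, not the paper's harmonized model.

```python
import math

def cpofdm_tx_flops(n_fft):
    """Rough real-operation count per CP-OFDM symbol: one IFFT
    (~5 N log2 N real operations, a standard textbook estimate)."""
    return 5 * n_fft * math.log2(n_fft)

def fbmc_tx_flops(n_fft, overlap_k=4):
    """FBMC/OQAM: IFFT plus a polyphase network with K*N taps,
    assuming ~4 real operations per tap."""
    return cpofdm_tx_flops(n_fft) + 4 * overlap_k * n_fft

for n in (512, 1024, 2048):
    ratio = fbmc_tx_flops(n) / cpofdm_tx_flops(n)
    print(f"N={n}: FBMC/CP-OFDM transmitter complexity ratio ~ {ratio:.2f}")
```

Even this crude model suggests why a harmonized implementation pays off: the dominant (I)FFT block is common to both families, so sharing it removes the largest single contributor from the standalone totals.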
One-hertz wind time series recorded at different levels (from 1.5 to 25.5 m) in an urban area are investigated using the Fisher–Shannon (FS) analysis. FS analysis is a well-known method for gaining insight into the complex behavior of nonlinear systems by quantifying the order/disorder properties of time series. Our findings reveal that the FS complexity, defined as the product of the Fisher information measure and the Shannon entropy power, decreases with the height of the anemometer above the ground, suggesting a height-dependent variability in the order/disorder features of the high-frequency wind speed measured in urban layouts. Furthermore, the correlation between the FS complexity of wind speed and the daily variance of the ambient temperature shows a similar decrease with the height of the wind sensor. This correlation is larger for the lower anemometers, indicating that ambient temperature is an important forcing of wind speed variability in the vicinity of the ground.
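The following sketch estimates the FS complexity directly from the definitions named above (Shannon entropy power times Fisher information measure). The histogram-based estimators here are crude, bin-sensitive stand-ins for the kernel-based estimators typically used in FS analysis; they serve only to make the quantities concrete.

```python
import numpy as np

def fisher_shannon_complexity(x, bins=64):
    """Histogram estimate of C = I * N: Fisher information measure I
    times Shannon entropy power N of the amplitude distribution of x."""
    p, edges = np.histogram(x, bins=bins, density=True)
    dx = edges[1] - edges[0]
    p = np.clip(p, 1e-12, None)                   # avoid log(0) / divide-by-zero
    h = -np.sum(p * np.log(p)) * dx               # differential entropy H
    n_x = np.exp(2.0 * h) / (2.0 * np.pi * np.e)  # entropy power N
    dp = np.gradient(p, dx)
    i_x = np.sum(dp ** 2 / p) * dx                # Fisher information I
    return i_x * n_x

# Sanity check: by Stam's inequality C >= 1, with equality for a Gaussian,
# so Gaussian data should give values close to 1.
rng = np.random.default_rng(0)
print(fisher_shannon_complexity(rng.normal(size=100_000)))  # ~1
```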
Starting from the premise that writing style is a reliable predictor of comprehension, this paper investigates the extent to which textual complexity features of nursing students' essays are related to the scores they were given. Forty essays about case studies on infectious diseases, written in French, were analyzed using ReaderBench, a multi-purpose framework relying on advanced Natural Language Processing techniques that provides a wide range of textual complexity indices. While the linear regression model was significant, a Discriminant Function Analysis was able to classify students into high- and low-performing groups with 82.5% accuracy. Overall, our statistical analysis highlights essay features centered on document cohesion flow and dialogism that are predictive of teachers' scoring processes. As text complexity strongly influences learners' reading and understanding, our approach can easily be extended in future developments to e-portfolio assessment, in order to provide customized feedback to students. This study is part of the RAGE project. The RAGE project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 644187. This publication reflects only the author's view. The European Commission is not responsible for any use that may be made of the information it contains.
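A hedged sketch of the kind of analysis described: a linear discriminant separating high- from low-scoring essays using complexity indices as predictors. The feature matrix below is synthetic; in the study, the columns would be ReaderBench indices (cohesion flow, dialogism, and similar) exported per essay.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
X = rng.normal(size=(40, 5))  # 40 essays x 5 complexity indices (synthetic)
# Synthetic high/low label driven by two of the indices plus noise.
y = ((X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=40)) > 0).astype(int)

lda = LinearDiscriminantAnalysis()
acc = cross_val_score(lda, X, y, cv=5).mean()
print(f"cross-validated classification accuracy: {acc:.2f}")
```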
Research suggests that the integrative complexity of political rhetoric tends to drop during election season, but little research to date directly addresses whether this drop in complexity serves to increase or decrease electoral success. The two present studies help fill this gap. Study 1 demonstrates that, during the Democratic Party primary debates in 2003–2004, the eventual winners of the party nomination showed a steeper drop in integrative complexity as the election season progressed than nonwinning candidates. Study 2 presents laboratory evidence from the most recent presidential campaign demonstrating that, while the complexity of Obama's rhetoric had little impact on college students' subsequent intentions to vote for him, the complexity of McCain's rhetoric was significantly positively correlated with their likelihood of voting for him. Taken together, this research is inconsistent with an unqualified "simple is effective" view of the complexity-success relationship. Rather, it is more consistent with a compensatory view: effective use of complexity (or simplicity) may compensate for perceived weaknesses. Thus, appropriately timed shifts in complexity levels, and/or violations of negative expectations relevant to complexity, may be an effective means of winning elections. Surprisingly, mere simplicity as such seems largely ineffective.
Green behaviors adopted by supply chain companies are conducive to resource conservation and environmental protection, and they enhance the companies' core competitive advantages. By constructing a game model of the green behavior of supply chain companies, this research analyzes the main factors influencing the green behaviors these companies adopt. It uses dynamic evolutionary game analysis and simulation experiments to explore the direction of path evolution and the dynamic convergence process of the companies' green-behavior strategy choices, so as to inform the green-behavior decision-making of supply chain enterprises. The results show that the probability of supply chain enterprises choosing green behavior strategies is related to factors such as enterprise green investment income and costs, co-benefits, spillover benefits, greenness and output of raw materials or products, government green subsidy coefficients, and fines. Supply chain enterprises should reduce the cost of green investment, maximize the profit of green investment, and increase the greenness of raw materials or products; the government should increase the green subsidy coefficient and encourage supply chain enterprises to actively participate in the collaborative management of the green supply chain.
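As a concrete illustration of the evolutionary-game machinery the study uses, below is a minimal replicator-dynamics simulation for a population of firms choosing between a green and a non-green strategy. All payoff parameters (investment return r, cost c, subsidy s, fine f, co-benefit b) are illustrative placeholders, not the paper's calibrated values.

```python
# Replicator dynamics: x is the share of firms playing "green";
# x grows when the green payoff exceeds the population-average payoff.
r, c, s, f, b = 3.0, 2.5, 0.5, 1.0, 0.8  # hypothetical parameter values

def green_payoff(x):
    """Payoff of going green: net return plus subsidy plus a co-benefit
    that scales with the share x of green partners."""
    return r - c + s + b * x

def brown_payoff(x):
    """Payoff of staying non-green: a flat fine (an assumed simplification)."""
    return -f

x, dt = 0.1, 0.01
for _ in range(5000):
    avg = x * green_payoff(x) + (1 - x) * brown_payoff(x)
    x += dt * x * (green_payoff(x) - avg)  # replicator equation
print(f"long-run share of green adopters: {x:.3f}")  # converges toward 1 here
```

With these placeholder values the green strategy strictly dominates, so the population converges to full adoption; lowering the subsidy or raising the investment cost enough flips the convergence direction, which is exactly the kind of parameter dependence the study reports.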
Proceedings of the 28th Annual Allerton Conference on Communication, Control, and Computing, Sept. 1990, regular (full) paper, pp. 948–957 (unrefereed). We analyze the computational complexity of the cost-table approach to designing multiple-valued logic circuits that is applicable to I²L, CCDs, current-mode CMOS, and RTDs. We show that this approach is NP-complete. An efficient algorithm is shown for finding the exact minimal realization of a given function by a given cost-table.
IEEE Transactions on Computers, February 1997, pp. 205–209. We analyze the computational complexity of the cost-table approach to designing multiple-valued logic circuits that is applicable to I²L, CCDs, current-mode CMOS, and RTDs. We show that this approach is NP-complete. An efficient algorithm is shown for finding the exact minimal realization of a given function by a given cost-table.
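To make the cost-table problem in the two abstracts above concrete, here is a hedged Python sketch of exact minimal realization by exhaustive search. It models realization as choosing a subset of table functions whose pointwise clipped sum equals the target truth vector; this is one common cost-table formulation for sum-based multiple-valued technologies, and the papers' precise composition operator and pruning strategy may differ. The exponential search space is consistent with the NP-completeness result.

```python
from itertools import combinations

RADIX = 4  # 4-valued logic

def realizes(functions, target):
    """True if the pointwise sum of the chosen functions, clipped to the
    maximum logic value, reproduces the target truth vector."""
    total = [min(sum(col), RADIX - 1) for col in zip(*functions)]
    return total == list(target)

def minimal_realization(cost_table, target):
    """cost_table: list of (truth_vector, cost). Returns the cheapest
    realizing subset and its cost, by brute force over all subsets."""
    best, best_cost = None, float("inf")
    for k in range(1, len(cost_table) + 1):
        for combo in combinations(cost_table, k):
            cost = sum(c for _, c in combo)
            if cost < best_cost and realizes([f for f, _ in combo], target):
                best, best_cost = combo, cost
    return best, best_cost

# Toy cost-table over one 4-valued variable (truth vectors of length 4).
table = [((0, 1, 2, 3), 5), ((1, 1, 1, 1), 2), ((0, 0, 1, 2), 3), ((1, 2, 3, 3), 4)]
print(minimal_realization(table, (1, 2, 3, 3)))  # single entry, cost 4
```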