Risk Assessment in Grid Computing
In: Possibility for Decision; Studies in Fuzziness and Soft Computing, S. 145-165
In: Infosecurity Today, Band 2, Heft 5, S. 22-25
Advances in e-Infrastructure promise to revolutionize sensing systems and the way in which data are collected and assimilated, and complex water systems are simulated and visualized. According to the EU Infrastructure 2010 work-programme, data and compute infrastructures and their underlying technologies, whether oriented towards scientific challenges or complex problem solving in engineering, are expected to converge into so-called knowledge infrastructures, leading to more effective research, education and innovation in the next decade and beyond. Grid technology is recognized as a fundamental component of e-Infrastructures. Nevertheless, this emerging paradigm raises several topics, including data management, algorithm optimization, security, performance (speed, throughput, bandwidth, etc.), and scientific cooperation and collaboration issues, that require further examination to fully exploit it and to better inform future research policies. The paper illustrates the results of six different surface and subsurface hydrology applications that have been deployed on the Grid. All the applications aim to address strong requirements from Civil Society at large concerning natural and anthropogenic risks. Grid technology has been successfully tested to improve flood prediction, groundwater resources management and Black Sea hydrological survey by providing large computing resources. It is also shown that Grid technology facilitates e-cooperation among partners by means of services for authentication and authorization, seamless access to distributed data sources, data protection and access rights, and standardization.
BASE
In: Social science computer review: SSCORE, Band 26, Heft 3, S. 301-316
ISSN: 1552-8286
Qualitative research is increasingly important in policy-related and applied work, as well as in academic work. Grid and high-performance computing (HPC) technologies promise significant potential returns for qualitative researchers. Tagged cyber-research in the United States and e-social science in the United Kingdom (and e-research in general), the application of HPC technologies can enhance the scope, depth, and rigor of qualitative inquiry by enabling new data-handling capacities and analytic procedures; new support for work with colleagues based elsewhere; and new facilities to archive, curate, and exploit the many kinds of data that qualitative researchers use. From these resources flow new challenges to conventions of privacy and research ethics, data integrity and data protection, and the relations between scientific communities and society. Based on a survey, individual interviews, and group discussions, involving qualitative researchers and computer scientists, this article scans existing applications of grid and HPC technologies to qualitative research; indicates potential applications; and identifies associated ethical, practical, and technological challenges.
In: Publication 2009,01
In: Transforming Government: People, Process and Policy, Band 4, Heft 4, S. 288-298
Purpose - The purpose of this paper is to examine three different, but related, distributed computing technologies in the context of public-funded e-science research, and to present the author's viewpoint on future directions.
Design/methodology/approach - The paper takes a critical look at the state-of-the-art with regard to three enabling technologies for e-science. It forms a set of arguments to support views on the evolution of these technologies in support of the e-science applications of the future.
Findings - Although grid computing has been embraced in public-funded higher education institutions and research centres as an enabler for projects pertaining to e-science, the adoption of desktop grids is low. With the advent of cloud computing and its promise of on-demand provisioning of computing resources, it is expected that the conventional form of grid computing will gradually move towards cloud-based computing. However, cloud computing also brings with it the "pay-per-use" economic model, and this may act as a stimulus for organisations engaged in e-science to harvest existing underutilised computation capacity through the deployment of organisation-wide desktop grid infrastructures. Conventional grid computing will continue to support future e-science applications, although its growth may remain stagnant.
Originality/value - The paper argues that there will be a gradual shift in the underlying distributed computing technologies that support e-science applications of the future. While cloud computing and desktop grid computing will gain in prominence, the growth of traditional cluster-based grid computing may remain dormant.
In: IEEE antennas & propagation magazine, Band 45, Heft 2, S. 91-99
ISSN: 1558-4143
In: International journal of critical infrastructures: IJCIS, Band 4, Heft 3, S. 308
ISSN: 1741-8038
In: International journal of critical infrastructures: IJCIS, Band 2, Heft 4, S. 412
ISSN: 1741-8038
In: Asian journal of research in social sciences and humanities: AJRSH, Band 6, Heft 9, S. 1430
ISSN: 2249-7315
In: Computer Communications
In this paper, we propose an efficient non-linear task workload prediction mechanism combined with a fair scheduling algorithm for task allocation and resource management in Grid computing. Workload prediction is accomplished in a Grid middleware approach using a non-linear model expressed as a series of finite known functional components, drawing on concepts of functional analysis. The coefficients of the functional components are obtained using a training set of appropriate samples, the pairs of which are estimated by a runtime estimation model based on a least squares approximation scheme. The advantages of the proposed non-linear task workload prediction scheme are that (i) it is not constrained by analysis of source code (analytical methods), which is practically impossible to implement for complicated real-life applications, and (ii) it does not exploit the variations of the workload statistics as statistical approaches do. The predicted task workload is then exploited by a novel scheduling algorithm, enabling fair Quality of Service oriented resource management so that some tasks are not favored over others. The algorithm is based on estimating the adjusted fair completion times of the tasks for task order selection, and on an earliest completion time strategy for grid resource assignment. Experimental results and comparisons with traditional scheduling approaches, as implemented in the framework of the European Union funded research projects GRIA and GRIDLAB grid infrastructures, have shown that the proposed method outperforms them. (c) 2005 Elsevier B.V. All rights reserved.
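The two ideas in this abstract, least-squares fitting of a workload model over known basis functions and earliest-completion-time resource assignment, can be sketched roughly as follows. This is a minimal illustration under assumed simplifications: a polynomial basis stands in for the paper's functional components, and the function names and fairness-free task ordering are this sketch's own, not the GRIA/GRIDLAB implementation.

```python
import numpy as np

def fit_workload_model(features, workloads, degree=2):
    """Least-squares fit of workload = sum_k c_k * phi_k(feature),
    using a polynomial basis as a stand-in for the paper's
    'finite known functional components'."""
    A = np.vander(np.asarray(features, dtype=float), degree + 1, increasing=True)
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(workloads, dtype=float), rcond=None)
    return coeffs

def predict_workload(coeffs, feature):
    """Evaluate the fitted model at a new task feature value."""
    return sum(c * feature ** k for k, c in enumerate(coeffs))

def earliest_completion_time_assignment(task_workloads, resource_speeds):
    """Assign each task, in the given order, to the resource on which it
    would finish earliest, given each resource's current availability."""
    available = [0.0] * len(resource_speeds)
    assignment = []
    for w in task_workloads:
        finish = [available[r] + w / resource_speeds[r]
                  for r in range(len(resource_speeds))]
        best = min(range(len(resource_speeds)), key=lambda r: finish[r])
        available[best] = finish[best]
        assignment.append(best)
    return assignment
```

A fair scheduler in the paper's sense would additionally reorder tasks by their adjusted fair completion times before the assignment loop; that ordering step is omitted here.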
The massive processing of information obtained from LiDAR (Light Detection and Ranging) sensors easily exceeds the processing capabilities of conventional computers. At present, public and private organisations accumulate large collections of data derived from this type of sensor without users being able to access them in an agile and efficient manner. The high cost of licences and the complexity of the software needed to process a LiDAR-derived dataset significantly reduce the number of users with tools for its exploitation to a limited set of providers. Against this background, new efforts have emerged that concentrate on making this information accessible to any user. This article discusses some of the solutions that support the remote processing and accessibility of LiDAR data through the use of the OpenGIS Web Processing Service standard implemented on a GRID Computing architecture. The results of recent research and the advances achieved within the framework of Spatial Data Infrastructures are identified. These works facilitate the treatment of, distribution of, and access to the data in question, and form the basis for future local and regional studies and proposals.
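The architecture described here exposes LiDAR processing through the OGC Web Processing Service (WPS) interface. A minimal sketch of how a client would address such a service via the WPS 1.0.0 key-value-pair binding follows; the endpoint URL and process identifier are hypothetical placeholders, not part of any system named in the abstract.

```python
from urllib.parse import urlencode

def wps_describe_process_url(endpoint, process_id):
    """Build a WPS 1.0.0 DescribeProcess request URL (KVP binding),
    which asks the service to describe one published process."""
    params = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "DescribeProcess",
        "identifier": process_id,
    }
    return endpoint + "?" + urlencode(params)

# Hypothetical endpoint and process name, for illustration only:
url = wps_describe_process_url("http://example.org/wps", "LidarGroundFilter")
```

Execute requests, which actually launch a processing job (here, on Grid resources behind the service), are typically sent as an XML document via HTTP POST rather than as a KVP URL.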
In: The journal of strategic information systems, Band 22, Heft 2, S. 137-156
ISSN: 1873-1198