SIMULATION AND TRAINING - Ammo Shortages Undermine Navy, Marine Corps Training
In: National Defense, Issue 576, pp. 74-77
ISSN: 0092-1491
Mobile robot sensors have emerged as tools for monitoring the environment, search and rescue, exploration and mapping, civil infrastructure analysis, and military operations. Detecting and tracking polluted areas with sensor-equipped mobile multi-robots is now regarded as a solution to environmental and human safety issues. This paper presents algorithms that allow sensor-equipped mobile multi-robots to estimate and monitor an irregularly shaped polluted area in a synchronized way. Changes in the behavior of dangerous environmental boundaries, such as spreading fires or oil spills, provide the data needed to mitigate the issue or even to support evacuation actions that save human or animal lives. In this paper, we present a model in which two robots, called Boundary Detection Robots (BDRs), move around an environmental boundary to predict its shape through a continuous analytical function based on a combination of polynomial approximations. These robots carry many sensors, each with embedded processors, wireless communication, and movement capabilities. We show that as the sampling frequency and the robot velocity increase, the strategy converges to the exact boundary. We conducted experiments with simulated and actual robots to assess the estimation quality. We analyze the reliability of the control unit and other components in a robot simulator and assess the efficiency of all components in a realistic set-up and environment model. We implement sensor-based robot control laws with a stable transmission range to track irregular area boundaries.
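The boundary-estimation idea above, building a continuous function from samples taken as a robot circles the area, can be sketched with an ordinary least-squares polynomial fit. This is a minimal illustration of the general technique, not the paper's actual BDR algorithm; the sampling pattern, degree, and synthetic boundary are assumptions:

```python
# Sketch: approximate an environmental boundary from robot samples by
# least-squares polynomial fitting of (angle, radius) pairs taken as a
# robot circles a reference point inside the polluted area.
import math

def fit_polynomial(xs, ys, degree):
    """Least-squares polynomial fit via the normal equations (pure stdlib)."""
    n = degree + 1
    ata = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    aty = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    for col in range(n):                      # elimination with partial pivoting
        piv = max(range(col, n), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        aty[col], aty[piv] = aty[piv], aty[col]
        for r in range(col + 1, n):
            f = ata[r][col] / ata[col][col]
            for c in range(col, n):
                ata[r][c] -= f * ata[col][c]
            aty[r] -= f * aty[col]
    coeffs = [0.0] * n                        # back substitution
    for r in range(n - 1, -1, -1):
        s = aty[r] - sum(ata[r][c] * coeffs[c] for c in range(r + 1, n))
        coeffs[r] = s / ata[r][r]
    return coeffs                             # c0 + c1*t + c2*t**2 + ...

def eval_poly(coeffs, t):
    return sum(c * t ** i for i, c in enumerate(coeffs))

# A robot circling the area samples the boundary radius at increasing angles.
angles = [i * 0.2 for i in range(32)]                 # denser sampling -> tighter fit
radii = [5.0 + 0.5 * math.sin(a) for a in angles]     # synthetic boundary shape
coeffs = fit_polynomial(angles, radii, degree=5)
err = max(abs(eval_poly(coeffs, a) - r) for a, r in zip(angles, radii))
```

As the abstract notes, raising the sampling frequency (more samples per revolution of a smooth boundary) drives the residual `err` toward zero.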
BASE
In: Sage open, Vol. 14, Issue 2
ISSN: 2158-2440
As social robots may be used by a single user or by multiple users, different social scenarios are becoming more important for defining human-robot relationships. This study therefore explored human-robot relationships between robots and users in different interaction modes to improve the user interaction experience. Specifically, education and companionship were selected as the most common application areas for social robots. The interaction modes included single-user interaction and multi-user interaction, and three human-robot relationships were adopted. The robot competence scale, human-robot trust scale, and acceptance of robot scale were used to evaluate subjects' views of the robots. The results demonstrate that in both scenarios, people were more inclined to maintain a more familiar and closer relationship with the social robot when the robot interacted with a single user. When multiple people interact in an education scenario, setting the robot to an acquaintance relationship is recommended to improve its perceived competence and people's trust in the robot. Similarly, in multi-user interaction in a companion scenario, an acquaintance relationship is more accepted and trusted. Based on these results, robot sensors can be added to further optimize human-robot interaction sensing systems: by identifying the number of users in the interaction environment, robots can automatically employ the best human-robot relationship for interaction. Optimizing human-robot interaction sensing systems can also improve perceived robot performance in the interaction, meeting different users' needs and achieving more natural human-robot interaction experiences.
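The closing proposal, choosing the relationship mode from the number of detected users, amounts to a small decision rule. A hypothetical sketch (the function name and mode labels are illustrative, not taken from the study's materials):

```python
def select_relationship(num_users: int, scenario: str) -> str:
    """Pick a relationship mode from the detected user count and scenario.

    Encodes the study's findings: single users favour a closer, more
    familiar relationship; multi-user education and companion scenarios
    favour an acquaintance relationship for perceived competence, trust,
    and acceptance.
    """
    if scenario not in ("education", "companion"):
        raise ValueError(f"unmodelled scenario: {scenario!r}")
    if num_users <= 1:
        return "close"        # single user: familiar/closer relationship
    return "acquaintance"     # multiple users: acquaintance preferred
```

A sensing system that counts users in view could call such a rule each time the audience changes, switching modes automatically as the abstract suggests.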
This article is an open-access article distributed under the terms and conditions of the Creative Commons Attribution license. ; In this article we explain the architecture for the environment and sensors that was built for the European project URUS (Ubiquitous Networking Robotics in Urban Sites), a project whose objective is to develop an adaptable network robot architecture for cooperation between network robots and human beings and/or the environment in urban areas. The project goal is to deploy a team of robots in an urban area to provide a set of services to a user community. This paper addresses the sensor architecture devised for URUS and the types of robots and sensors used, including environment sensors and sensors on board the robots. Furthermore, we also explain how sensor fusion takes place to achieve urban outdoor execution of robotic services. Finally, some results of the project related to the sensor network are highlighted. ; This work has been supported by URUS, the Ubiquitous Robotics in Urban Settings project, funded by the European Commission (FP6-IST-045062); by CONET, the Cooperating Objects Network of Excellence (FP7-2007-2-224053); by the Spanish national projects UbRob, Ubiquitous Robotics in Urban Settings (DPI2007-61452), and PAU, Perception and Action under Uncertainty (DPI2008-06022), of the DPI program; by MIPRCV, Multimodal Interaction in Pattern Recognition and Computer Vision, of the Consolider Ingenio 2010 program (CSD2007-00018); and by the project SIRE, funded by the Andalusian Government (P06-TEP-01494). ; Peer Reviewed
BASE
Robots used in the military are usually equipped with integrated systems including video screens, sensors, grippers, and cameras. This Android-application-controlled warfare robot is built with a robotic arm mechanism to pick up or place small objects such as explosives, an on-board wireless video camera, infrared-based perception of surface depth and irregularities, and an Android application for movement and other controls of the robot. The robot will serve as an appropriate gadget for the defence sector, reducing the loss of human life. Sushmita Shivalkar | Geeta Yadav | Swapnali Patil | Sakshi Dale, "Warfare Robot", published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-3, Issue-3, April 2019, URL: https://www.ijtsrd.com/papers/ijtsrd22888.pdf
BASE
In: World futures review: a journal of strategic foresight, Vol. 6, Issue 3, pp. 251-260
ISSN: 2169-2793
The first industrial robot was installed on a production line in Japan in 1969. Through the miraculous recovery and great economic growth of post-war Japan, more robots were developed. Today, we see robots operating in many environments other than indoor assembly lines. Meanwhile, robots have truly come of age as a result of remarkable advances in programming (artificial intelligence) and sensor-information technology. In the future, we will find more super-intelligent humanoids coexisting with us, and cities will be structured as robot-friendly environments where not only humanoids but ubiquitous machines of all kinds, with built-in computers, sensors, and actuators, will enhance and supplement human existence in countless ways.
Mobile robots are used in various application areas including manufacturing, mining, military operations, search and rescue missions, and so on. As such, there is a need to model robot mobility in a way that tracks robot system modules such as the navigation system and vision-based object recognition. The navigation system must locate the position of the robot in the surrounding environment and then plan a path towards the desired destination, identifying all potential obstacles in order to find a suitable path. The objective of this research is to develop a simulation system to identify difficulties facing mobile robot navigation in industrial environments, and then tackle these problems effectively. The simulation makes use of information provided by various sensors including vision, range, and force sensors. With the help of battery-operated mobile robots it is possible to move objects around in any industrial or manufacturing plant and thus minimize the environmental impact of carbon emissions and pollution. The use of such robots in industry also makes it safe to deal with hazardous materials. In industry, a mobile robot deals with many tools and pieces of equipment, and it therefore has to detect, recognize, and track these objects. In this paper, object detection and recognition are based on vision sensors followed by image processing techniques. Techniques covered include Speeded Up Robust Features (SURF), template matching, and colour segmentation. If the robot detects the target in its view, it tracks the target and then grasps it. However, if the object is not in the current view, the robot continues its search to find it. To make the mobile robot move in its environment, a number of basic path planning strategies have been used. In the navigation system, the robot navigates to the nearest wall (or similar obstacle) and then moves along that obstacle.
If an obstacle is detected by the robot using the built-in ultrasonic range sensor, the robot navigates around that obstacle and then continues moving along it. While the robot is self-navigating in its environment, it continues to look for the target. The robot used in this work is scalable to industrial applications in mining, search and rescue missions, and so on. This robot is environmentally friendly and does not produce carbon emissions. In this paper the simulation of a path planning algorithm for an autonomous robot is presented. Results of modelling the robot in a real-world industrial environment for testing the robot's navigation are also discussed.
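The navigation behaviour described, heading for the nearest wall, following it, and detouring when the ultrasonic ranger flags an obstacle, is naturally expressed as a small state machine. A simplified sketch with assumed states, thresholds, and command names (not the paper's implementation):

```python
# Simplified wall-following controller driven by ultrasonic range readings.
# States, distances, and command strings are illustrative assumptions.

SAFE_DIST = 0.30       # metres: desired clearance from the wall being followed
OBSTACLE_DIST = 0.15   # metres: anything closer ahead counts as an obstacle

def navigation_step(state: str, front_range: float, side_range: float):
    """One control tick: return (new_state, motion_command)."""
    if state == "seek_wall":
        if side_range <= SAFE_DIST:          # reached a wall: start following it
            return "follow_wall", "turn_parallel"
        return "seek_wall", "forward"
    if state == "follow_wall":
        if front_range <= OBSTACLE_DIST:     # obstacle ahead: detour around it
            return "avoid_obstacle", "turn_away"
        if side_range > 2 * SAFE_DIST:       # lost the wall: reacquire it
            return "seek_wall", "turn_toward_wall"
        return "follow_wall", "forward"
    if state == "avoid_obstacle":
        if front_range > SAFE_DIST:          # path ahead clear: resume following
            return "follow_wall", "forward"
        return "avoid_obstacle", "turn_away"
    raise ValueError(f"unknown state: {state!r}")

# Example tick sequence: approach the wall, then meet an obstacle head-on.
state = "seek_wall"
state, cmd = navigation_step(state, front_range=1.0, side_range=0.25)
# state is now "follow_wall"
state, cmd = navigation_step(state, front_range=0.10, side_range=0.25)
# state is now "avoid_obstacle", cmd is "turn_away"
```

In the full system the target search described in the abstract would run in parallel, interrupting this loop when the vision pipeline reports a detection.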
BASE
In: Computers and Electronics in Agriculture, Vol. 147, pp. 91-108
In: Springer eBook Collection
1 Introduction -- Robotic Practice; Exploiting Mathematics; Making Sense of Sensors; Computing for Design; Future Directions -- I: Sensor Information Processing -- 2 A method for grasping randomly oriented objects using touch sensing -- 3 Method of contour recognition -- 4 The design of sensors for a mobile teleoperator robot -- II: Mathematical Concerns -- 5 Constrained average path tracking for industrial robots -- 6 The application of spline functions to trajectory generation for computer-controlled manipulators -- 7 Kinematic equations of robot manipulators -- 8 Solution of kinematic equations for robot manipulators -- III: Practical Concerns -- 9 A strategy to achieve an assembly by means of an inaccurate, flexible robot -- 10 Trajectory planning for a multi-arm robot in an assembly task -- 11 Cooperation of two manipulators in assembly tasks -- IV: Computer Aids to Robot Design -- 12 A CAD system for programming and simulating robots' actions -- 13 The development of a suite of programs for the analysis of mechanisms.
This paper presents a general framework for planning a multipurpose robot that can be used in multiple fields, both civil and military. The framework covers the assembly of multiple sensors, a mechanical arm, live video streaming, long-range remote control, and more in a single robot. The planning problem is one of five fundamental challenges in the development of a real robotic system able to serve both military and civil purposes, such as live surveillance (both automatic and manual), rescue in the aftermath of natural disasters, firefighting, object picking, hazard detection (ignition, volatile gases), and exploration of underground mines or even terrestrial exploration. The four other areas, hardware design, programming, control, and artificial intelligence, are also discussed.
BASE
This paper proposes the application of a low-cost gas sensor array in an assistant personal robot (APR) in order to extend the capabilities of the mobile robot as an early gas leak detector for safety purposes. The gas sensor array is composed of 16 low-cost metal-oxide (MOX) gas sensors, which are operated continuously. The mobile robot was modified to keep the gas sensor array switched on at all times, even during battery recharge. The gas sensor array provides 16 individual gas measurements and one output that is a cumulative summary of all measurements, used as an overall indicator of a change in gas concentration. The results of preliminary experiments were used to train a partial least squares discriminant analysis (PLS-DA) classifier with air, ethanol, and acetone as output classes. Then, the mobile robot's gas leak detection capabilities were experimentally evaluated in a public facility, by forcing the evaporation of (1) ethanol, (2) acetone, and (3) ethanol and acetone at different locations. The positive results obtained under different operating conditions over the course of one month confirmed the early detection capabilities of the proposed mobile system. For example, the APR was able to detect a gas leak produced inside a closed room from the external corridor, due to small leakages under the door induced by the forced ventilation system of the building. ; This work was partially funded by the Spanish MINECO program, under grants DPI2017-89827-R (ALPE), BES-2015-071698 (Severo-Ochoa), and TEC2014-59229-R (SIGVOL), and by the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation program (grant agreement no. H2020-780262-SHARE4RARE).
This work received support from the Departament d'Universitats, Recerca i Societat de la Informació de la Generalitat de Catalunya (expedients 2017 SGR 1721 and 2017 SGR 952) and the Networking Biomedical Research Center in the subject of area of Bioengineering, Biomaterials, and Nanomedicine (CIBER-BBN). This work was partially funded by ACCIÓ, Agència per la Competitivitat de l'Empresa, under grant INNOTECRD18-1-0054 Innotec. J.F. acknowledges the support from the Serra Húnter program.
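The "cumulative summary" output described above, one scalar indicator aggregating all 16 MOX readings, can be sketched as a sum of positive deviations from per-sensor clean-air baselines with a threshold alarm. The baselines and threshold here are illustrative assumptions, not the paper's calibration (which additionally uses a trained PLS-DA classifier to identify the gas):

```python
# Sketch of a cumulative leak indicator over a 16-sensor MOX array:
# sum each sensor's positive deviation from its clean-air baseline and
# raise an alarm when the total crosses a threshold.

BASELINE = [0.50] * 16   # assumed clean-air response per sensor (arbitrary units)
THRESHOLD = 1.0          # assumed alarm level for the summed deviation

def cumulative_indicator(readings):
    """Overall indicator of a gas-concentration change across the array."""
    if len(readings) != len(BASELINE):
        raise ValueError("expected one reading per sensor")
    return sum(max(0.0, r - b) for r, b in zip(readings, BASELINE))

def leak_detected(readings):
    return cumulative_indicator(readings) > THRESHOLD

clean = [0.50] * 16
# A leak nearby drives the four sensors closest to the source well above baseline.
leak = [0.50] * 12 + [0.95, 0.90, 0.85, 0.80]
assert not leak_detected(clean)
assert leak_detected(leak)
```

Aggregating before thresholding is what makes the indicator sensitive to weak, distributed responses, such as the under-the-door leakage case reported in the abstract, where no single sensor may deviate strongly on its own.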
BASE
In: Institute of Scholars (InSc), 2020
SSRN
In: Springer eBook Collection
1: Introduction -- 2: Why use robots? -- Robot versus hard automation -- Robot availability, purchase and viability -- Assessing the robot market -- 3: Which configuration? -- Specialized robots -- Robot capability -- Which programming method? -- Important elements in robot specifications -- 4: Calculation of cycle times -- Methods for calculating cycle time -- Parts trees -- Assessing workload -- 5: Grippers -- Sophisticated gripper versus simple gripper -- Compliance -- Types of grippers -- Gripper design -- Sensory control of grippers -- Gripper classifications -- Multiple robots -- Multiple arm robots -- Multiple grippers -- 6: The assembly process -- Assembly techniques -- 7: Product and process design for assembly -- Product compatibility -- Method of construction -- Fifteen design rules -- 8: Workstations -- The system -- Assembly line balancing -- Balancing a robot line -- Implementation of workstations -- 9: Material feeders -- Automatic feeders -- Feeding delicate items -- Component manufacture at site of usage -- Feeding consumable materials -- Conveyors -- Automated guided vehicles -- Prepackaged material control -- 10: Sensing and vision -- Sensors -- Selecting a suitable sensor -- Automatic inspection -- 11: Man-machine mix -- Interaction between man and machine -- Programming -- Man-robot systems -- 12: Safety -- Humans at risk from injury by robots -- Safety procedures and devices -- Procedural checks -- 13: Evaluation of a robot system -- Methods of financial appraisal -- Strategic and tactical justification -- Productivity ratios -- Robot versus manual cost per hour -- Resource graphs -- Cost groups -- The proposal -- 14: Economics of alternative systems -- Assessing costs -- Equilateral triangle -- Direct calculation -- Benefits and total expected savings -- Systems for a range of quantities -- Flexible and fixed aspects of automated assembly -- 15: Economics of robots and grippers -- The workstation -- The gripper design -- 16: The future -- The short-term outlook -- The long-term outlook -- References and bibliography -- Appendix: Assembly Robots Available in the USA and the UK.
SSRN
Working paper
Recently, autonomous robot teams have been implemented broadly in many social and military applications such as firefighting, agriculture, search and rescue, mapping, target tracking, and docking. A mix of different types of ground robots and aerial vehicles can be employed in a robot team to accomplish tasks efficiently and robustly. Such heterogeneous systems show unparalleled benefits in complex tasks compared to teams composed of identical robot types. In a heterogeneous robot team, precise relative localization, i.e., estimating a robot's position with respect to its neighbor robots, plays a key role. We develop a relative localization system for air-ground robot teams where an aerial vehicle and multiple ground robots work in coordination to perform a reliable relative position estimation. The aerial vehicle is employed to detect special patterns on the ground robots by an onboard monocular camera, while the ground robots perform relative position estimation based on inter-robot distances acquired by ultrawideband sensors and the bearing and heading angles received from the aerial vehicle by communication. Thus, the aerial vehicle serves as an absolute frame provider for the entire team. Notably, each robot in the team uses onboard communication and computation capabilities solely without any need for an external localization infrastructure, making the team realizable in all conditions including GNSS-denied environments. We propose a multi-rate extended Kalman filter algorithm to handle different data rates of the sensor measurements. We carried out an extensive simulation study with a drone and five ground robots in a leader-first follower formation. Simulation results showed a successful estimation performance with an error rate of up to five centimeters in the relative position estimations in both axes. 
; In recent years, autonomous robots have been widely used in many social and military applications such as firefighting, agriculture, search and rescue, mapping, target tracking, and orientation. To carry out tasks efficiently and ...
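A multi-rate filter like the one proposed runs its prediction step on every tick and applies a measurement update only when a given sensor actually fires. A scalar linear Kalman-filter sketch of that scheduling idea (the noise constants and data are illustrative; the paper's filter is an extended KF over 2-D relative positions fusing UWB ranges with camera-derived angles):

```python
# Scalar multi-rate Kalman filter sketch: predict every tick, update only
# when a measurement arrives (None = no measurement this tick).

Q = 0.01   # assumed process-noise variance
R = 0.25   # assumed measurement-noise variance

def kf_step(x, p, z=None):
    """One tick: constant-state prediction, then an optional update."""
    p = p + Q                  # predict: uncertainty grows each tick
    if z is not None:          # update only when this sensor produced data
        k = p / (p + R)        # Kalman gain
        x = x + k * (z - x)    # pull the estimate toward the measurement
        p = (1.0 - k) * p      # measurement shrinks the uncertainty
    return x, p

# A fast sensor (e.g. a UWB range) fires on most ticks; slower fixes arrive
# as occasional entries. The true value here is around 1.0.
x, p = 0.0, 1.0
for z in [1.2, 1.0, None, None, 1.1, None, 0.9, 1.0, None, 1.05]:
    x, p = kf_step(x, p, z)
```

Running each sensor's update at its own rate against a shared prediction clock is what lets the team fuse the fast ultrawideband ranges with the slower camera-derived bearing and heading angles without resampling either stream.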
BASE