SAFAR is a joint project of UniBots and MobotSoft, an academic initiative to build a Software Architecture For Agricultural Robots. UniBots, a university spin-off company based in the UK, was set up by Simon Blackmore to commercialise recent developments in behavioural control for outdoor mobile robots. UniBots provides a conduit from existing university designs and prototypes around the world into the market. The purpose of the SAFAR project is to develop a set of designs, tools and resources to promote the development of agricultural robots. MobotSoft is in charge of the software development, using Microsoft Robotics Developer Studio as the underlying platform.
A version of the new SAFAR2 system is available to download. This new version includes a desktop application that integrates Google Earth and Microsoft Robotics Developer Studio, making the system much easier to use.
SAFAR2 is constantly being developed and improved to add new features, and new versions will be available for download.
Requirements and Installation
To get started, you need to install Microsoft Robotics Developer Studio 4 on your PC. It is available from the Microsoft Robotics website. Once you have installed RDS 4, run the SAFAR2 installer. You can download it here: Safar.msi
Important: Please make sure you install and run SAFAR2 under the same user account you used to install Microsoft Robotics Developer Studio, which must have administrator privileges.
The Google Earth Plugin is also required, but the first time SAFAR2 is run, it will show a “Get the Google Earth Plugin now” button if the plugin is not found in your system, as seen in the image below:
If you already have the plugin installed, you may need to update it to the latest version. You can check the version of the Google Earth Plugin installed on your system by opening the About box of SAFAR2 (‘Help->About SAFAR…’ menu option). If it shows a ‘Google Earth Plugin not initialized’ message, you need to update the plugin: uninstall it first and then restart SAFAR2.
Getting started with SAFAR2
These are the basic steps needed to create a farm with fields, routes, objects and vehicles, and to run a realistic 3D simulation of it.
1 – Open SAFAR2
The disabled tabs in the main screen correspond to features that are not yet developed. They will be implemented in future versions.
2 – Choose any place on Earth where you want to create your farm
3 – Add fields to your farm
The main entity in SAFAR2 is the Farm. As a SAFAR2 user, you will open and save .farm files containing all the data of the farms you create. A farm must have at least one field, so press the “Add Field” button and start adding points with the mouse to define the field boundary with a polygon. Finish the boundary by right-clicking; SAFAR2 will ask for a name for the new field. When you create a new field, the gateway is automatically placed at the first point of the boundary. You can press the ‘Move Gateway’ button later and change its position: pick the gateway with the mouse and drag it to the desired position, then click outside the gateway to leave ‘gateway editing’ mode. The ‘Fields’ tab of SAFAR2 is shown in the image below.
4 – Add exclusion areas to the fields
Optionally, a field can contain Exclusion Areas. Press the “Add Exclusion Area” button to add one to the field selected in the Fields listbox.
5 – Add routes to the fields
You can add routes to a field that will be used later by the Route Following service either in simulation or with real vehicles. Press the “Add Route manually” button and start adding the waypoints that will describe the route. Each waypoint has properties that you can modify later by pressing the ‘Edit Route’ button. These properties are:
> The waypoint number.
> Northing, in UTM coordinates.
> Easting, in UTM coordinates.
> Tolerance, in meters. It is the distance at which the waypoint will be considered reached.
> Speed, at which the robot should move until it reaches the waypoint.
> Navigation Mode. Possible values are ‘STWP’ (Straight to Waypoint) and ‘MXTE’ (Minimize Cross Track Error). The navigation algorithm is explained below.
> Task. For now, the possible values are “Lift Cutters” and “Lower Cutters”. You can leave it empty too.
When drawing a route, you can add waypoints with “Lower Cutters” Task and “MXTE” Navigation Mode by holding the SHIFT key down when clicking. Otherwise, the waypoints are added with “Lift Cutters” Task and “STWP” Navigation Mode. Future versions will include a ‘Route Planner’ that will generate a route automatically based on the geometry of the field, exclusion areas, robot configuration and other configurable parameters.
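As an illustration, the waypoint properties listed above can be modelled as a small data structure. This is a hypothetical sketch in Python; the class and field names are not SAFAR2’s actual .farm file format, and only the property meanings come from the list above:

```python
import math
from dataclasses import dataclass
from typing import Optional

@dataclass
class Waypoint:
    number: int            # position of the waypoint in the route
    northing: float        # UTM northing, in meters
    easting: float         # UTM easting, in meters
    tolerance: float       # meters; the waypoint counts as reached within this distance
    speed: float           # speed at which the robot should move towards the waypoint
    nav_mode: str          # 'STWP' (Straight to Waypoint) or 'MXTE' (Minimize Cross Track Error)
    task: Optional[str] = None  # 'Lift Cutters', 'Lower Cutters', or empty

def reached(wp: Waypoint, northing: float, easting: float) -> bool:
    """A waypoint is considered reached when the vehicle is inside its tolerance radius."""
    return math.hypot(wp.northing - northing, wp.easting - easting) <= wp.tolerance
```

For example, a vehicle half a metre away from a waypoint with a 2 m tolerance would already count as having reached it.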
6 – Add vehicles to your farm
Go to the ‘Vehicles’ tab. You will see a list of the vehicle models available in SAFAR2. Select one of them and press the ‘Add->’ button to add a vehicle to your farm. You can change the default values for each model by pressing the ‘Change default config’ button; these settings are saved at the SAFAR2 application level. Press the ‘Edit vehicle config’ button to change the parameters of the vehicles you added to the farm; these values are saved in your .farm file. For now, only the hardware parameters of real vehicles are configurable, so you don’t need to edit them for simulation.
7 – Add simulation objects to your fields
Go to the ‘Simulation Objects’ tab and add trees, rocks, ponds and posts to the selected field. They will appear on the map as an icon corresponding to the type of object. The ‘Simulation objects’ tab is shown in the following image.
8 – Configure the simulation
Go to the ‘Run Simulation’ tab and configure the settings that will be used to run the MRDS simulation:
Attached camera: You can configure the relative position and orientation of the camera attached to the vehicle, as well as the width and height of the image.
Initial View: Initial position of the main camera.
Route Plan: Configure how the Route Plan will be shown in the simulation. You can check/uncheck options for showing ‘Waypoints’ and ‘Route’ (lines joining the waypoints). Also, you can choose how the vehicle trajectory will be shown. Options are “Area Covered”, “Vehicle Track” or “None”.
Terrain: Choose the kind of surface you want: ‘Ground Plane’ or ‘Height Field’. The second option will build a 3D terrain using the altitude values read from the Google Earth Plugin. If you select ‘Ground Plane’, you can choose the kind of image that will be mapped on top of the terrain: the Google Earth snapshot or a tiled picture.
Vehicle: Finally, choose the vehicle you want to simulate.
9 – Start the simulation
Press the ‘Start MRDS Simulation’ button and, after the generation of several files, the MRDS Visual Simulation Environment will start. It will also create two windows: the Dashboard and the Route Following. Use the Dashboard to drive the simulated vehicle manually (press the ‘Drive’ button first).
The Route Following service will show the routes you have created. Choose one and press the ‘Apply’ button. The route will be drawn in the MRDS Visual Environment after a few seconds. Go to the ‘Control’ tab of the Route Following window and press the ‘Play’ button. The vehicle will start to follow the route.
For now, it is not possible to restart the simulation: once the MRDS simulation has started, the ‘Start MRDS Simulation’ button is disabled and SAFAR2 must be exited and started again. When the simulator is running, you can move the camera viewpoint by dragging the mouse pointer across the screen. This does not change the camera position, but it changes the point that the camera is looking at.
Holding the Shift key down while pressing the camera movement keys makes the camera move 20 times faster. You can use F8 to quickly switch between the main camera and the camera attached to the vehicle. The image below shows a simulation of Tomi in a realistic 3D environment.
When a real vehicle is available, simply go to the ‘Run real-world’ tab and press the ‘Start control of real vehicle’ button; the robot is then ready to be controlled via the ‘Dashboard’ or ‘Route Following’ services, the same way as in simulation mode.
Route Following algorithm
As described in the Routes explanation, the Route Following service has two different modes for navigating towards the next waypoint:
‘Straight to Waypoint’: This mode drives the vehicle directly towards the waypoint. It uses the current location estimate and the waypoint coordinates to compute the desired heading, and generates the control command according to the following proportional controller equation:
steering_angle = Kp * heading_error
where Kp is the Proportional Gain parameter that appears in the Config tab of the Route Following window.
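A minimal sketch of this proportional controller follows. The heading convention (north = 0, angles in radians), the angle wrap-around, and the steering clamp are assumptions added for the example; only the equation steering_angle = Kp * heading_error comes from the text above:

```python
import math

def stwp_steering(vehicle_n, vehicle_e, vehicle_heading,
                  wp_n, wp_e, kp, max_steer=math.radians(35)):
    """Proportional steering towards a waypoint ('Straight to Waypoint' mode).

    Computes the desired heading as the bearing from the vehicle to the
    waypoint, then applies steering_angle = Kp * heading_error, clamped to
    a maximum steering angle (max_steer is an assumed vehicle limit).
    All angles are in radians; north is heading 0.
    """
    desired = math.atan2(wp_e - vehicle_e, wp_n - vehicle_n)   # bearing to waypoint
    error = math.atan2(math.sin(desired - vehicle_heading),
                       math.cos(desired - vehicle_heading))    # wrap error to [-pi, pi]
    steer = kp * error
    return max(-max_steer, min(max_steer, steer))
```

With the waypoint straight ahead the heading error is zero and the vehicle steers straight; a waypoint off to one side produces a steering angle proportional to the error, up to the clamp.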
‘Minimize Cross Track Error’: This is an implementation of a typical follow-the-carrot algorithm. In this mode the vehicle steers towards a point on the line segment between the previous waypoint and the next waypoint, thus minimizing the cross-track error (XTE). This carrot position lies a look-ahead distance away from the vehicle centre and is updated on every cycle of the algorithm. The steering angle is computed by the above-mentioned equation, using this moving target, or carrot, as the goal. The look-ahead distance is the configurable parameter in the Route Following window.
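The carrot-point update can be sketched as follows. This is a hypothetical illustration of a standard follow-the-carrot step: projecting the vehicle onto the segment and clamping the carrot to the segment end are conventional details assumed here, not details documented by SAFAR2:

```python
import math

def carrot_and_xte(prev_wp, next_wp, vehicle, lookahead):
    """Return the carrot target point and the signed cross-track error.

    The vehicle position is projected onto the segment prev_wp -> next_wp;
    the cross-track error (XTE) is the perpendicular offset from that
    segment, and the carrot lies a look-ahead distance further along it,
    clamped to the segment end. Points are (northing, easting) tuples.
    """
    pn, pe = prev_wp
    nn, ne = next_wp
    vn, ve = vehicle
    sn, se = nn - pn, ne - pe
    seg_len = math.hypot(sn, se)
    un, ue = sn / seg_len, se / seg_len           # unit vector along the segment
    along = (vn - pn) * un + (ve - pe) * ue       # projection distance from prev_wp
    xte = (vn - pn) * ue - (ve - pe) * un         # signed perpendicular offset
    carrot = min(along + lookahead, seg_len)      # clamp the carrot to the segment end
    return (pn + carrot * un, pe + carrot * ue), xte
```

The carrot returned here would then be fed to the same proportional steering equation in place of the waypoint itself, which is what makes the vehicle converge onto the segment rather than cutting straight to the next waypoint.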
The last parameter of the Config tab, the “Rotate Scale Factor”, is the percentage of the waypoint speed at which the vehicle will move when it turns with maximum steering angle. For example, if you set this value to 0.4, then when the robot turns with maximum steering angle the speed is reduced to 40% of the straight-line speed. This improves the control and helps to avoid vehicle rollovers.
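Assuming a linear reduction between the two documented endpoints (SAFAR2 only specifies the behaviour at zero and at maximum steering angle; the interpolation in between is an assumption), the speed scaling could look like this:

```python
def scaled_speed(waypoint_speed, steering_angle, max_steering_angle,
                 rotate_scale_factor):
    """Reduce speed as the steering angle approaches its maximum.

    At zero steering the full waypoint speed is used; at maximum steering
    the speed drops to rotate_scale_factor * waypoint_speed (e.g. a factor
    of 0.4 keeps 40% of the straight-line speed). The linear interpolation
    between those endpoints is an assumption for this sketch.
    """
    frac = min(abs(steering_angle) / max_steering_angle, 1.0)
    return waypoint_speed * (1.0 - (1.0 - rotate_scale_factor) * frac)
```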
The algorithm can be tuned by changing these parameters. At each step the current XTE is computed, and the total cumulative root mean squared XTE is shown in the Control tab of the Route Following window. This way, different combinations of parameter values can be compared. You can also write your own navigation algorithms using the SAFAR Scripting Engine! Click the link to see details on how to write SAFAR scripts in the IronPython language.
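The running root-mean-squared XTE figure used for comparing parameter sets can be reproduced with a simple accumulator. This is an illustrative sketch, not SAFAR2’s actual code:

```python
import math

class XteTracker:
    """Accumulates per-step cross-track errors and reports the running
    RMS XTE, the kind of figure shown in the Control tab, so that
    different parameter combinations can be compared on the same route."""

    def __init__(self):
        self.sum_sq = 0.0   # sum of squared errors so far
        self.count = 0      # number of algorithm cycles recorded

    def update(self, xte):
        self.sum_sq += xte * xte
        self.count += 1

    def rms(self):
        return math.sqrt(self.sum_sq / self.count) if self.count else 0.0
```

A lower final RMS XTE for the same route indicates a better-tuned combination of proportional gain, look-ahead distance and rotate scale factor.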
Future work
- The system will be tested on real vehicles soon.
- More robots with different driving configurations will be added to the simulator.
- Other types of sensors will be added.
- A SLAM algorithm will be added.
- The control algorithm will be improved, other algorithms will be tested, and the architecture will be extended to allow the robot to switch dynamically between planned and reactive behaviours according to the environment.
If you have questions, comments or suggestions, please send them to us via our contact form.