SDFormat base element that can include 0-N models, actors, lights, and/or worlds. A user of multiple worlds could run parallel instances of simulation, or offer selection of a world at runtime. Version number of the SDFormat specification. The world element encapsulates an entire world description including: models, scene, physics, joints, and plugins. Unique name of the world. Global audio properties. Device to use for audio playback. A value of "default" will use the system's default audio device. Otherwise, specify an audio device file. The wind tag specifies the type and properties of the wind. Linear velocity of the wind. Include resources from a URI. URI to a resource, such as a model. Override the name of the included model. Override the static value of the included model. A position (x,y,z) and orientation (roll, pitch, yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. The gravity vector in m/s^2, expressed in a coordinate frame defined by the spherical_coordinates tag. The magnetic vector in Tesla, expressed in a coordinate frame defined by the spherical_coordinates tag. The atmosphere tag specifies the type and properties of the atmosphere model. The type of the atmosphere engine. Current options are adiabatic. Defaults to adiabatic if left unspecified. Temperature at sea level in kelvins. Pressure at sea level in pascals. Temperature gradient with respect to increasing altitude at sea level in units of K/m. Set the type of projection for the camera. Valid values are "perspective" and "orthographic". Name of the tracked visual. If no name is provided, the remaining settings will be applied whenever tracking is triggered in the GUI. Minimum distance between the camera and the tracked visual. This parameter is only used if static is set to false. Maximum distance between the camera and the tracked visual. This parameter is only used if static is set to false. If set to true, the position of the camera is fixed relative to the model or to the world, depending on the value of the use_model_frame element. Otherwise, the position of the camera may vary but the distance between the camera and the model will depend on the value of the min_dist and max_dist elements. In any case, the camera will always follow the model by changing its orientation. If set to true, the position of the camera is relative to the model reference frame, which means that its position relative to the model will not change. Otherwise, the position of the camera is relative to the world reference frame, which means that its position relative to the world will not change. This parameter is only used if static is set to true. The position of the camera's reference frame. This parameter is only used if static is set to true. If use_model_frame is set to true, the position is relative to the model reference frame, otherwise it represents world coordinates. If set to true, the camera will inherit the yaw rotation of the tracked model. This parameter is only used if static and use_model_frame are set to true. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position (x,y,z) and orientation (roll, pitch, yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position (x,y,z) and orientation (roll, pitch, yaw) with respect to the specified frame. Name of frame which the pose is defined relative to.
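As a rough illustration of how the world-level elements above fit together, the following is a minimal sketch of an SDF world (the element names come from the specification, but the world name, the included model URI, and all numeric values are illustrative assumptions):

  <sdf version="1.7">
    <world name="example_world">                 <!-- illustrative world name -->
      <audio>
        <device>default</device>                 <!-- use the system's default audio device -->
      </audio>
      <wind>
        <linear_velocity>0 1 0</linear_velocity> <!-- assumed wind velocity -->
      </wind>
      <gravity>0 0 -9.8</gravity>                <!-- gravity vector in m/s^2 -->
      <magnetic_field>6e-06 2.3e-05 -4.2e-05</magnetic_field>  <!-- in Tesla -->
      <atmosphere type="adiabatic">
        <temperature>288.15</temperature>        <!-- sea-level temperature in kelvins -->
        <pressure>101325</pressure>              <!-- sea-level pressure in pascals -->
      </atmosphere>
      <include>
        <uri>model://some_model</uri>            <!-- hypothetical model URI -->
        <name>renamed_model</name>               <!-- overrides the included model's name -->
        <static>true</static>                    <!-- overrides the included model's static value -->
        <pose>1 0 0 0 0 0</pose>
      </include>
    </world>
  </sdf>
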
A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. The physics tag specifies the type and properties of the dynamics engine. The name of this set of physics parameters. If true, this physics element is set as the default physics profile for the world. If multiple default physics elements exist, the first element marked as default is chosen. If no default physics element exists, the first physics element is chosen. The type of the dynamics engine. Current options are ode, bullet, simbody and dart. Defaults to ode if left unspecified. Maximum time step size at which every system in simulation can interact with the states of the world. (was physics.sdf's dt). Target simulation speedup factor, defined by the ratio of simulation time to real time. Rate at which to update the physics engine (UpdatePhysics calls per real-time second). (was physics.sdf's update_rate). Maximum number of contacts allowed between two entities. This value can be overridden by a max_contacts element in a collision element. DART specific physics properties. One of the following types: pgs, dantzig. PGS stands for Projected Gauss-Seidel. Specify collision detector for DART to use. Can be dart, fcl, bullet or ode. Simbody specific physics properties. (Currently not used in Simbody.) The time duration which advances with each iteration of the dynamics engine; this has to be no bigger than max_step_size under the physics block. If left unspecified, min_step_size defaults to max_step_size. Roughly the relative error of the system. -LOG(accuracy) is roughly the number of significant digits. Tolerable "slip" velocity allowed by the solver when static friction is supposed to hold an object in place. Relationship among dissipation, coefficient of restitution, etc.: d = dissipation coefficient (1/velocity); vc = capture velocity (velocity where e=e_max); vp = plastic velocity (smallest v where e=e_min), with vp > vc. Assume real COR=1 when v=0. e_min = given minimum COR, at v >= vp (a.k.a. plastic_coef_restitution); d = slope = (1-e_min)/vp, or equivalently e_min = 1 - d*vp; e_max = maximum COR = 1-d*vc, reached at v=vc. e = 0 for v <= vc; e = 1 - d*v for vc < v < vp; e = e_min for v >= vp. Dissipation factor = d*min(v,vp) [compliant]; cor = e [rigid]. Combining rule: e = 0 if e1==e2==0, otherwise e = 2*e1*e2/(e1+e2). Default contact material stiffness (force/dist or torque/radian). Dissipation coefficient to be used in compliant contact; if not given it is (1-min_cor)/plastic_impact_velocity. This is the COR to be used at high velocities for rigid impacts; if not given it is 1 - dissipation*plastic_impact_velocity. Smallest impact velocity at which min COR is reached; set to zero if you want the min COR always to be used. Static friction (mu_s) as described by this plot: http://gazebosim.org/wiki/File:Stribeck_friction.png Dynamic friction (mu_d) as described by this plot: http://gazebosim.org/wiki/File:Stribeck_friction.png Viscous friction (mu_v) with units of (1/velocity) as described by this plot: http://gazebosim.org/wiki/File:Stribeck_friction.png For rigid impacts only, impact velocity at which COR is set to zero; normally inherited from global default but can be overridden here. Combining rule: use larger velocity. This is the largest slip velocity at which we'll consider a transition to stiction. Normally inherited from a global default setting.
For a continuous friction model this is the velocity at which the max static friction force is reached. Combining rule: use larger velocity Bullet specific physics properties One of the following types: sequential_impulse only. The time duration which advances with each iteration of the dynamics engine, this has to be no bigger than max_step_size under physics block. If left unspecified, min_step_size defaults to max_step_size. Number of iterations for each step. A higher number produces greater accuracy at a performance cost. Set the successive over-relaxation parameter. Bullet constraint parameters. Constraint force mixing parameter. See the ODE page for more information. Error reduction parameter. See the ODE page for more information. The depth of the surface layer around all geometry objects. Contacts are allowed to sink into the surface layer up to the given depth before coming to rest. The default value is zero. Increasing this to some small value (e.g. 0.001) can help prevent jittering problems due to contacts being repeatedly made and broken. Similar to ODE's max_vel implementation. See http://web.archive.org/web/20120430155635/http://bulletphysics.org/mediawiki-1.5.8/index.php/BtContactSolverInfo#Split_Impulse for more information. Similar to ODE's max_vel implementation. See http://web.archive.org/web/20120430155635/http://bulletphysics.org/mediawiki-1.5.8/index.php/BtContactSolverInfo#Split_Impulse for more information. ODE specific physics properties One of the following types: world, quick The time duration which advances with each iteration of the dynamics engine, this has to be no bigger than max_step_size under physics block. If left unspecified, min_step_size defaults to max_step_size. Number of threads to use for "islands" of disconnected models. Number of iterations for each step. A higher number produces greater accuracy at a performance cost. Experimental parameter. Set the successive over-relaxation parameter. Flag to use threading to speed up position correction computation. Flag to enable dynamic rescaling of moment of inertia in constrained directions. See gazebo pull request 1114 for the implementation of this feature. https://bitbucket.org/osrf/gazebo/pull-request/1114 Name of ODE friction model to use. Valid values include: pyramid_model: (default) friction forces limited in two directions in proportion to normal force. box_model: friction forces limited to constant in two directions. cone_model: friction force magnitude limited in proportion to normal force. See gazebo pull request 1522 for the implementation of this feature. https://bitbucket.org/osrf/gazebo/pull-request/1522 https://bitbucket.org/osrf/gazebo/commits/8c05ad64967c ODE constraint parameters. Constraint force mixing parameter. See the ODE page for more information. Error reduction parameter. See the ODE page for more information. The maximum correcting velocities allowed when resolving contacts. The depth of the surface layer around all geometry objects. Contacts are allowed to sink into the surface layer up to the given depth before coming to rest. The default value is zero. Increasing this to some small value (e.g. 0.001) can help prevent jittering problems due to contacts being repeatedly made and broken. Specifies the look of the environment. Color of the ambient light. Color of the background. 
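The physics element, including the ODE solver and constraint parameters described above, can be put together as in the following sketch (the profile name and all numeric values are illustrative assumptions, not recommended defaults):

  <physics name="default_physics" default="true" type="ode">
    <max_step_size>0.001</max_step_size>                  <!-- seconds per simulation step -->
    <real_time_factor>1.0</real_time_factor>              <!-- target speedup: sim time / real time -->
    <real_time_update_rate>1000</real_time_update_rate>   <!-- UpdatePhysics calls per real-time second -->
    <max_contacts>20</max_contacts>
    <ode>
      <solver>
        <type>quick</type>                                <!-- "world" or "quick" -->
        <iters>50</iters>                                 <!-- iterations per step -->
        <sor>1.3</sor>                                    <!-- successive over-relaxation parameter -->
        <friction_model>pyramid_model</friction_model>
      </solver>
      <constraints>
        <cfm>0.0</cfm>                                    <!-- constraint force mixing -->
        <erp>0.2</erp>                                    <!-- error reduction parameter -->
        <contact_max_correcting_vel>100</contact_max_correcting_vel>
        <contact_surface_layer>0.001</contact_surface_layer>
      </constraints>
    </ode>
  </physics>
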
Properties for the sky. Time of day [0..24]. Sunrise time [0..24]. Sunset time [0..24]. Speed of the clouds. Direction of the cloud movement. Density of clouds. Average size of the clouds. Ambient cloud color. Enable/disable shadows. Controls fog. Fog color. Fog type: constant, linear, quadratic. Distance to start of fog. Distance to end of fog. Density of fog. Enable/disable the grid. Show/hide world origin indicator. The light element describes a light source. A unique name for the light. The light type: point, directional, spot. When true, the light will cast shadows. Diffuse light color. Specular light color. Light attenuation. Range of the light. The linear attenuation factor: 1 means attenuate evenly over the distance. The constant attenuation factor: 1.0 means never attenuate, 0.0 is complete attenuation. The quadratic attenuation factor: adds a curvature to the attenuation. Direction of the light, only applicable for spot and directional lights. Spot light parameters. Angle covered by the bright inner cone. Angle covered by the outer cone. The rate of falloff between the inner and outer cones. 1.0 means a linear falloff, less means slower falloff, higher means faster falloff. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position (x,y,z) and orientation (roll, pitch, yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position (x,y,z) and orientation (roll, pitch, yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. The model element defines a complete robot or any other physical object. A unique name for the model. This name must not match another model in the world. If set to true, the model is immovable. Otherwise the model is simulated in the dynamics engine. If set to true, all links in the model will collide with each other (except those connected by a joint). Can be overridden by the link or collision element self_collide property. Two links within a model will collide if link1.self_collide OR link2.self_collide. Links connected by a joint will never collide. Allows a model to auto-disable, which means the physics engine can skip updating the model when the model is at rest. This parameter is only used by models with no joints. Include resources from a URI. This can be used to nest models. URI to a resource, such as a model. Override the pose of the included model. A position and orientation in the global coordinate frame for the model. Position (x,y,z) and rotation (roll, pitch, yaw) in the global coordinate frame. Override the name of the included model. Override the static value of the included model. A nested model element. A unique name for the model. This name must not match another nested model in the same level as this model. If set to true, all links in the model will be affected by the wind. Can be overridden by the link wind property. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position (x,y,z) and orientation (roll, pitch, yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position (x,y,z) and orientation (roll, pitch, yaw) with respect to the specified frame. Name of frame which the pose is defined relative to.
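A sketch combining the scene and light elements described earlier in this section (all colors, times, and the light name are illustrative assumptions):

  <scene>
    <ambient>0.4 0.4 0.4 1</ambient>             <!-- ambient light color -->
    <background>0.7 0.7 0.7 1</background>       <!-- background color -->
    <shadows>true</shadows>
    <sky>
      <time>10</time>                            <!-- time of day [0..24] -->
      <sunrise>6</sunrise>
      <sunset>20</sunset>
      <clouds>
        <speed>12</speed>                        <!-- speed of the clouds -->
      </clouds>
    </sky>
  </scene>
  <light name="sun" type="directional">          <!-- illustrative light name -->
    <cast_shadows>true</cast_shadows>
    <diffuse>0.8 0.8 0.8 1</diffuse>
    <specular>0.2 0.2 0.2 1</specular>
    <attenuation>
      <range>1000</range>
      <constant>0.9</constant>
      <linear>0.01</linear>
      <quadratic>0.001</quadratic>
    </attenuation>
    <direction>-0.5 0.1 -0.9</direction>
  </light>
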
A physical link with inertia, collision, and visual properties. A link must be a child of a model, and any number of links may exist in a model. A unique name for the link within the scope of the model. If true, the link is affected by gravity. If true, the link is affected by the wind. If true, the link can collide with other links in the model. Two links within a model will collide if link1.self_collide OR link2.self_collide. Links connected by a joint will never collide. If true, the link is kinematic only. If true, the link will have 6DOF and be a direct child of world. Exponential damping of the link's velocity. Linear damping. Angular damping. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position (x,y,z) and orientation (roll, pitch, yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position (x,y,z) and orientation (roll, pitch, yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. The inertial properties of the link. The mass of the link. The 3x3 rotational inertia matrix. Because the rotational inertia matrix is symmetric, only the 6 elements on or above the diagonal are specified here, using the attributes ixx, ixy, ixz, iyy, iyz, izz. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position (x,y,z) and orientation (roll, pitch, yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. This is the pose of the inertial reference frame, relative to the specified reference frame. The origin of the inertial reference frame needs to be at the center of gravity. The axes of the inertial reference frame do not need to be aligned with the principal axes of the inertia. Name of frame which the pose is defined relative to. The collision properties of a link. Note that this can be different from the visual properties of a link, for example, simpler collision models are often used to reduce computation time. Unique name for the collision element within the scope of the parent link. Intensity value returned by laser sensor. Maximum number of contacts allowed between two entities. This value overrides the max_contacts element defined in physics. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position (x,y,z) and orientation (roll, pitch, yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position (x,y,z) and orientation (roll, pitch, yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. The shape of the visual or collision object. You can use the empty tag to make empty geometries. Box shape. The three side lengths of the box. The origin of the box is in its geometric center (inside the center of the box). Cylinder shape. Radius of the cylinder. Length of the cylinder. A heightmap based on a 2d grayscale image. URI to a grayscale image file. The size of the heightmap in world units. When loading an image: "size" is used if present, otherwise defaults to 1x1x1. When loading a DEM: "size" is used if present, otherwise defaults to true size of DEM. A position offset. The heightmap can contain multiple textures. The order of the textures matters.
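The link, inertial, collision, and geometry elements described above might be assembled as in the following sketch (the link name, mass, inertia values, and box dimensions are illustrative assumptions):

  <link name="chassis">                          <!-- illustrative link name -->
    <gravity>true</gravity>
    <self_collide>false</self_collide>
    <pose>0 0 0.1 0 0 0</pose>
    <inertial>
      <mass>2.0</mass>                           <!-- kilograms -->
      <pose>0 0 0 0 0 0</pose>                   <!-- center of gravity in the link frame -->
      <inertia>
        <ixx>0.02</ixx> <ixy>0</ixy> <ixz>0</ixz>
        <iyy>0.02</iyy> <iyz>0</iyz>
        <izz>0.02</izz>
      </inertia>
    </inertial>
    <collision name="chassis_collision">
      <max_contacts>10</max_contacts>
      <geometry>
        <box>
          <size>0.5 0.3 0.2</size>               <!-- box side lengths in meters -->
        </box>
      </geometry>
    </collision>
  </link>
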
The first texture will appear at the lowest height, and the last texture at the highest height. Use blend to control the height thresholds and fade between textures. Size of the applied texture in meters. Diffuse texture image filename Normalmap texture image filename The blend tag controls how two adjacent textures are mixed. The number of blend elements should equal one less than the number of textures. Min height of a blend layer Distance over which the blend occurs Set if the rendering engine will use terrain paging Samples per heightmap datum. For rasterized heightmaps, this indicates the number of samples to take per pixel. Using a lower value, e.g. 1, will generally improve the performance of the heightmap but lower the heightmap quality. Extrude a set of boxes from a grayscale image. URI of the grayscale image file Scaling factor applied to the image Grayscale threshold Height of the extruded boxes The amount of error in the model Mesh shape Mesh uri Use a named submesh. The submesh must exist in the mesh specified by the uri Name of the submesh within the parent mesh Set to true to center the vertices of the submesh at 0,0,0. This will effectively remove any transformations on the submesh before the poses from parent links and models are applied. Scaling factor applied to the mesh Plane shape Normal direction for the plane. When a Plane is used as a geometry for a Visual or Collision object, then the normal is specified in the Visual or Collision frame, respectively. Length of each side of the plane. Note that this property is meaningful only for visualizing the Plane, i.e., when the Plane is used as a geometry for a Visual object. The Plane has infinite size when used as a geometry for a Collision object. Defines an extruded polyline shape A series of points that define the path of the polyline. Height of the polyline Sphere shape radius of the sphere The surface parameters Bounciness coefficient of restitution, from [0...1], where 0=no bounciness. Bounce capture velocity, below which effective coefficient of restitution is 0. Parameters for torsional friction Torsional friction coefficient, unitless maximum ratio of tangential stress to normal stress. If this flag is true, torsional friction is calculated using the "patch_radius" parameter. If this flag is set to false, "surface_radius" (R) and contact depth (d) are used to compute the patch radius as sqrt(R*d). Radius of contact patch surface. Surface radius on the point of contact. Torsional friction parameters for ODE Force dependent slip for torsional friction, equivalent to inverse of viscous damping coefficient with units of rad/s/(Nm). A slip value of 0 is infinitely viscous. ODE friction parameters Coefficient of friction in first friction pyramid direction, the unitless maximum ratio of force in first friction pyramid direction to normal force. Coefficient of friction in second friction pyramid direction, the unitless maximum ratio of force in second friction pyramid direction to normal force. Unit vector specifying first friction pyramid direction in collision-fixed reference frame. If the friction pyramid model is in use, and this value is set to a unit vector for one of the colliding surfaces, the ODE Collide callback function will align the friction pyramid directions with a reference frame fixed to that collision surface. If both surfaces have this value set to a vector of zeros, the friction pyramid directions will be aligned with the world frame. If this value is set for both surfaces, the behavior is undefined. 
Force dependent slip in first friction pyramid direction, equivalent to inverse of viscous damping coefficient with units of m/s/N. A slip value of 0 is infinitely viscous. Force dependent slip in second friction pyramid direction, equivalent to inverse of viscous damping coefficient with units of m/s/N. A slip value of 0 is infinitely viscous. Coefficient of friction in first friction pyramid direction, the unitless maximum ratio of force in first friction pyramid direction to normal force. Coefficient of friction in second friction pyramid direction, the unitless maximum ratio of force in second friction pyramid direction to normal force. Unit vector specifying first friction pyramid direction in collision-fixed reference frame. If the friction pyramid model is in use, and this value is set to a unit vector for one of the colliding surfaces, the friction pyramid directions will be aligned with a reference frame fixed to that collision surface. If both surfaces have this value set to a vector of zeros, the friction pyramid directions will be aligned with the world frame. If this value is set for both surfaces, the behavior is undefined. Coefficient of rolling friction Flag to disable contact force generation, while still allowing collision checks and contact visualization to occur. Bitmask for collision filtering when collide_without_contact is on Bitmask for collision filtering. This will override collide_without_contact Bitmask for category of collision filtering. Collision happens if ((category1 & collision2) | (category2 & collision1)) is not zero. If not specified, the category_bitmask should be interpreted as being the same as collide_bitmask. Poisson's ratio is the unitless ratio between transverse and axial strain. This value must lie between (-1, 0.5). Defaults to 0.3 for typical steel. Note typical silicone elastomers have Poisson's ratio near 0.49 ~ 0.50. For reference, approximate values for Material:(Young's Modulus, Poisson's Ratio) for some of the typical materials are: Plastic: (1e8 ~ 3e9 Pa, 0.35 ~ 0.41), Wood: (4e9 ~ 1e10 Pa, 0.22 ~ 0.50), Aluminum: (7e10 Pa, 0.32 ~ 0.35), Steel: (2e11 Pa, 0.26 ~ 0.31). Young's Modulus in SI derived unit Pascal. Defaults to -1. If value is less or equal to zero, contact using elastic modulus (with Poisson's Ratio) is disabled. For reference, approximate values for Material:(Young's Modulus, Poisson's Ratio) for some of the typical materials are: Plastic: (1e8 ~ 3e9 Pa, 0.35 ~ 0.41), Wood: (4e9 ~ 1e10 Pa, 0.22 ~ 0.50), Aluminum: (7e10 Pa, 0.32 ~ 0.35), Steel: (2e11 Pa, 0.26 ~ 0.31). ODE contact parameters Soft constraint force mixing. Soft error reduction parameter dynamically "stiffness"-equivalent coefficient for contact joints dynamically "damping"-equivalent coefficient for contact joints maximum contact correction velocity truncation term. minimum allowable depth before contact correction impulse is applied Bullet contact parameters Soft constraint force mixing. Soft error reduction parameter dynamically "stiffness"-equivalent coefficient for contact joints dynamically "damping"-equivalent coefficient for contact joints Similar to ODE's max_vel implementation. See http://bulletphysics.org/mediawiki-1.5.8/index.php/BtContactSolverInfo#Split_Impulse for more information. Similar to ODE's max_vel implementation. See http://bulletphysics.org/mediawiki-1.5.8/index.php/BtContactSolverInfo#Split_Impulse for more information. 
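A sketch of a surface block for a collision, using the friction, bounce, and ODE contact parameters described above (all coefficients shown are illustrative assumptions, not recommended values):

  <surface>
    <friction>
      <ode>
        <mu>0.8</mu>                             <!-- friction in the first pyramid direction -->
        <mu2>0.8</mu2>                           <!-- friction in the second pyramid direction -->
        <fdir1>0 0 0</fdir1>                     <!-- zeros: align the pyramid with the world frame -->
        <slip1>0.0</slip1>
        <slip2>0.0</slip2>
      </ode>
      <torsional>
        <coefficient>1.0</coefficient>
        <use_patch_radius>true</use_patch_radius>
        <patch_radius>0.05</patch_radius>
      </torsional>
    </friction>
    <bounce>
      <restitution_coefficient>0.1</restitution_coefficient>
      <threshold>0.01</threshold>                <!-- bounce capture velocity -->
    </bounce>
    <contact>
      <collide_bitmask>0x01</collide_bitmask>
      <ode>
        <soft_cfm>0.0</soft_cfm>
        <soft_erp>0.2</soft_erp>
        <kp>1e6</kp>                             <!-- contact "stiffness"-equivalent coefficient -->
        <kd>1.0</kd>                             <!-- contact "damping"-equivalent coefficient -->
        <max_vel>0.01</max_vel>
        <min_depth>0.0</min_depth>
      </ode>
    </contact>
  </surface>
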
Soft contact parameters based on the paper: http://www.cc.gatech.edu/graphics/projects/Sumit/homepage/papers/sigasia11/jain_softcontacts_siga11.pdf This is variable k_v in the soft contacts paper. Its unit is N/m. This is variable k_e in the soft contacts paper. Its unit is N/m. Viscous damping of point velocity in body frame. Its unit is N/m/s. Fraction of mass to be distributed among deformable nodes. The visual properties of the link. This element specifies the shape of the object (box, cylinder, etc.) for visualization purposes. Unique name for the visual element within the scope of the parent link. If true the visual will cast shadows. Will be implemented in a future release. The amount of transparency (0 = opaque, 1 = fully transparent). Optional meta information for the visual. The information contained within this element should be used to provide additional feedback to an end user. The layer in which this visual is displayed. The layer number is useful for programs, such as Gazebo, that put visuals in different layers for enhanced visualization. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position (x,y,z) and orientation (roll, pitch, yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position (x,y,z) and orientation (roll, pitch, yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. The material of the visual element. Name of material from an installed script file. This will override the color element if the script exists. URI of the material script file. Name of the script within the script file. vertex, pixel, normal_map_objectspace, normal_map_tangentspace. Filename of the normal map. If false, dynamic lighting will be disabled. The ambient color of a material specified by a set of four numbers representing red/green/blue/alpha, each in the range of [0,1]. The diffuse color of a material specified by a set of four numbers representing red/green/blue/alpha, each in the range of [0,1]. The specular color of a material specified by a set of four numbers representing red/green/blue/alpha, each in the range of [0,1]. The emissive color of a material specified by a set of four numbers representing red/green/blue/alpha, each in the range of [0,1]. Physically Based Rendering (PBR) material. There are two PBR workflows: metal and specular. While both workflows and their parameters can be specified at the same time, typically only one of them will be used (depending on the underlying renderer capability). It is also recommended to use the same workflow for all materials in the world. PBR using the Metallic/Roughness workflow. Filename of the diffuse/albedo map. Filename of the roughness map. Material roughness in the range of [0,1], where 0 represents a smooth surface and 1 represents a rough surface. This is the inverse of a specular map in a PBR specular workflow. Filename of the metalness map. Material metalness in the range of [0,1], where 0 represents non-metal and 1 represents raw metal. Filename of the environment / reflection map, typically in the form of a cubemap. Filename of the ambient occlusion map. The map defines the amount of ambient lighting on the surface. Filename of the normal map. The normals can be in the object space or tangent space as specified in the 'type' attribute. The space that the normals are in. Values are: 'object' or 'tangent'. Filename of the emissive map.
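As a sketch of the visual and material elements above, the following visual uses the Metallic/Roughness PBR workflow (the visual name, mesh URI, texture filenames, and numeric values are illustrative assumptions):

  <visual name="body_visual">                    <!-- illustrative visual name -->
    <cast_shadows>true</cast_shadows>
    <transparency>0</transparency>
    <geometry>
      <mesh>
        <uri>model://some_model/meshes/body.dae</uri>   <!-- hypothetical mesh URI -->
        <scale>1 1 1</scale>
      </mesh>
    </geometry>
    <material>
      <diffuse>0.7 0.7 0.7 1</diffuse>
      <specular>0.2 0.2 0.2 1</specular>
      <pbr>
        <metal>
          <albedo_map>materials/textures/body_albedo.png</albedo_map>        <!-- hypothetical file -->
          <roughness_map>materials/textures/body_roughness.png</roughness_map>
          <roughness>0.6</roughness>             <!-- 0 = smooth, 1 = rough -->
          <metalness>0.1</metalness>             <!-- 0 = non-metal, 1 = raw metal -->
          <normal_map type="tangent">materials/textures/body_normal.png</normal_map>
        </metal>
      </pbr>
    </material>
  </visual>
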
PBR using the Specular/Glossiness workflow. Filename of the diffuse/albedo map. Filename of the specular map. Filename of the glossiness map. Material glossiness in the range of [0-1], where 0 represents a rough surface and 1 represents a smooth surface. This is the inverse of a roughness map in a PBR metal workflow. Filename of the ambient occlusion map. The map defines the amount of ambient lighting on the surface. Filename of the normal map. The normals can be in the object space or tangent space as specified in the 'type' attribute The space that the normals are in. Values are: 'object' or 'tangent' Filename of the emissive map. The shape of the visual or collision object. You can use the empty tag to make empty geometries. Box shape The three side lengths of the box. The origin of the box is in its geometric center (inside the center of the box). Cylinder shape Radius of the cylinder Length of the cylinder A heightmap based on a 2d grayscale image. URI to a grayscale image file The size of the heightmap in world units. When loading an image: "size" is used if present, otherwise defaults to 1x1x1. When loading a DEM: "size" is used if present, otherwise defaults to true size of DEM. A position offset. The heightmap can contain multiple textures. The order of the texture matters. The first texture will appear at the lowest height, and the last texture at the highest height. Use blend to control the height thresholds and fade between textures. Size of the applied texture in meters. Diffuse texture image filename Normalmap texture image filename The blend tag controls how two adjacent textures are mixed. The number of blend elements should equal one less than the number of textures. Min height of a blend layer Distance over which the blend occurs Set if the rendering engine will use terrain paging Samples per heightmap datum. For rasterized heightmaps, this indicates the number of samples to take per pixel. Using a lower value, e.g. 1, will generally improve the performance of the heightmap but lower the heightmap quality. Extrude a set of boxes from a grayscale image. URI of the grayscale image file Scaling factor applied to the image Grayscale threshold Height of the extruded boxes The amount of error in the model Mesh shape Mesh uri Use a named submesh. The submesh must exist in the mesh specified by the uri Name of the submesh within the parent mesh Set to true to center the vertices of the submesh at 0,0,0. This will effectively remove any transformations on the submesh before the poses from parent links and models are applied. Scaling factor applied to the mesh Plane shape Normal direction for the plane. When a Plane is used as a geometry for a Visual or Collision object, then the normal is specified in the Visual or Collision frame, respectively. Length of each side of the plane. Note that this property is meaningful only for visualizing the Plane, i.e., when the Plane is used as a geometry for a Visual object. The Plane has infinite size when used as a geometry for a Collision object. Defines an extruded polyline shape A series of points that define the path of the polyline. Height of the polyline Sphere shape radius of the sphere A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. The sensor tag describes the type and properties of a sensor. 
A unique name for the sensor. This name must not match another sensor in the model. The type name of the sensor. By default, SDFormat supports types air_pressure, altimeter, camera, contact, depth_camera, force_torque, gps, gpu_lidar, gpu_ray, imu, lidar, logical_camera, magnetometer, multicamera, ray, rfid, rfidtag, rgbd_camera, sonar, thermal_camera, wireless_receiver, and wireless_transmitter. The "ray" and "gpu_ray" types are equivalent to "lidar" and "gpu_lidar", respectively. It is preferred to use "lidar" and "gpu_lidar" since "ray" and "gpu_ray" will be deprecated. The "ray" and "gpu_ray" types are maintained for legacy support. If true, the sensor will always be updated according to the update rate. The frequency at which the sensor data is generated. If left unspecified, the sensor will generate data every cycle. If true, the sensor is visualized in the GUI. Name of the topic on which data is published. This is necessary for visualization. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position (x,y,z) and orientation (roll, pitch, yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position (x,y,z) and orientation (roll, pitch, yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. These elements are specific to an air pressure sensor. The initial altitude in meters. This value can be used by a sensor implementation to augment the altitude of the sensor. For example, if you are using simulation, instead of creating a 1000 m mountain model on which to place your sensor, you could instead set this value to 1000 and place your model on a ground plane with a Z height of zero. Noise parameters for the pressure data. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to an altimeter sensor. Noise parameters for vertical position. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise).
"gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical velocity The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to camera sensors. An optional name for the camera. Horizontal field of view The image size in pixels and format. Width in pixels Height in pixels (L8|R8G8B8|B8G8R8|BAYER_RGGB8|BAYER_BGGR8|BAYER_GBRG8|BAYER_GRBG8) The near and far clip planes. Objects closer or farther than these planes are not rendered. Near clipping plane Far clipping plane Enable or disable saving of camera frames. True = saving enabled The path name which will hold the frame data. If path name is relative, then directory is relative to current working directory. Depth camera parameters Type of output The near and far clip planes. Objects closer or farther than these planes are not detected by the depth camera. Near clipping plane for depth camera Far clipping plane for depth camera The properties of the noise model that should be applied to generated images The type of noise. Currently supported types are: "gaussian" (draw additive noise values independently for each pixel from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. Lens distortion to be applied to camera images. 
See http://en.wikipedia.org/wiki/Distortion_(optics)#Software_correction The radial distortion coefficient k1 The radial distortion coefficient k2 The radial distortion coefficient k3 The tangential distortion coefficient p1 The tangential distortion coefficient p2 The distortion center or principal point Lens projection description Type of the lens mapping. Supported values are gnomonical, stereographic, equidistant, equisolid_angle, orthographic, custom. For gnomonical (perspective) projection, it is recommended to specify a horizontal_fov of less than or equal to 90° If true the image will be scaled to fit horizontal FOV, otherwise it will be shown according to projection type parameters Definition of custom mapping function in a form of r=c1*f*fun(theta/c2 + c3). See https://en.wikipedia.org/wiki/Fisheye_lens#Mapping_function Linear scaling constant Angle scaling constant Angle offset constant Focal length of the optical system. Note: It's not a focal length of the lens in a common sense! This value is ignored if 'scale_to_fov' is set to true Possible values are 'sin', 'tan' and 'id' Everything outside of the specified angle will be hidden, 90° by default Resolution of the environment cube map used to draw the world Camera intrinsic parameters for setting a custom perspective projection matrix (cannot be used with WideAngleCamera since this class uses image stitching from 6 different cameras for achieving a wide field of view). The focal lengths can be computed using focal_length_in_pixels = (image_width_in_pixels * 0.5) / tan(field_of_view_in_degrees * 0.5 * PI/180) X focal length (in pixels, overrides horizontal_fov) Y focal length (in pixels, overrides horizontal_fov) X principal point (in pixels) Y principal point (in pixels) XY axis skew A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. These elements are specific to the contact sensor. name of the collision element within a link that acts as the contact sensor. Topic on which contact data is published. These elements are specific to the GPS sensor. Parameters related to GPS position measurement. Noise parameters for horizontal position measurement, in units of meters. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. 
A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical position measurement, in units of meters. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to GPS position measurement. Noise parameters for horizontal velocity measurement, in units of meters/second. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical velocity measurement, in units of meters/second. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. 
For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the IMU sensor. This string represents special hardcoded use cases that are commonly seen with typical robot IMU's: - CUSTOM: use Euler angle custom_rpy orientation specification. The orientation of the IMU's reference frame is defined by adding the custom_rpy rotation to the parent_frame. - NED: The IMU XYZ aligns with NED, where NED orientation relative to Gazebo world is defined by the SphericalCoordinates class. - ENU: The IMU XYZ aligns with ENU, where ENU orientation relative to Gazebo world is defined by the SphericalCoordinates class. - NWU: The IMU XYZ aligns with NWU, where NWU orientation relative to Gazebo world is defined by the SphericalCoordinates class. - GRAV_UP: where direction of gravity maps to IMU reference frame Z-axis with Z-axis pointing in the opposite direction of gravity. IMU reference frame X-axis direction is defined by grav_dir_x. Note if grav_dir_x is parallel to gravity direction, this configuration fails. Otherwise, IMU reference frame X-axis is defined by projection of grav_dir_x onto a plane normal to the gravity vector. IMU reference frame Y-axis is a vector orthogonal to both X and Z axis following the right hand rule. - GRAV_DOWN: where direction of gravity maps to IMU reference frame Z-axis with Z-axis pointing in the direction of gravity. IMU reference frame X-axis direction is defined by grav_dir_x. Note if grav_dir_x is parallel to gravity direction, this configuration fails. Otherwise, IMU reference frame X-axis is defined by projection of grav_dir_x onto a plane normal to the gravity vector. IMU reference frame Y-axis is a vector orthogonal to both X and Z axis following the right hand rule. This field and parent_frame are used when localization is set to CUSTOM. Orientation (fixed axis roll, pitch yaw) transform from parent_frame to this IMU's reference frame. Some common examples are: - IMU reports in its local frame on boot. IMU sensor frame is the reference frame. Example: parent_frame="", custom_rpy="0 0 0" - IMU reports in Gazebo world frame. Example sdf: parent_frame="world", custom_rpy="0 0 0" - IMU reports in NWU frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between North-West-Up and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="0 0 0" - IMU reports in NED frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between North-East-Down and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="M_PI 0 0" - IMU reports in ENU frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between East-North-Up and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. 
Example sdf given world is NWU: parent_frame="world", custom_rpy="0 0 -0.5*M_PI" - IMU reports in ROS optical frame as described in http://www.ros.org/reps/rep-0103.html#suffix-frames, which is (z-forward, x-left to right when facing +z, y-top to bottom when facing +z). (default gazebo camera is +x:view direction, +y:left, +z:up). Example sdf: parent_frame="local", custom_rpy="-0.5*M_PI 0 -0.5*M_PI" Name of parent frame which the custom_rpy transform is defined relative to. It can be any valid fully scoped Gazebo Link name or the special reserved "world" frame. If left empty, use the sensor's own local frame. Used when localization is set to GRAV_UP or GRAV_DOWN, a projection of this vector into a plane that is orthogonal to the gravity vector defines the direction of the IMU reference frame's X-axis. grav_dir_x is defined in the coordinate frame as defined by the parent_frame element. Name of parent frame in which the grav_dir_x vector is defined. It can be any valid fully scoped Gazebo Link name or the special reserved "world" frame. If left empty, use the sensor's own local frame. Topic on which data is published. DEPRECATED. Use the topic element that is a child of the sensor element. These elements are specific to body-frame angular velocity, which is expressed in radians per second Angular velocity about the X axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Angular velocity about the Y axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). 
For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Angular velocity about the Z axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to body-frame linear acceleration, which is expressed in meters per second squared Linear acceleration about the X axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Linear acceleration about the Y axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. 
For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Linear acceleration about the Z axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the lidar sensor. The number of simulated lidar rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle The number of simulated lidar rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle specifies range properties of each simulated lidar The minimum distance for each lidar ray. The maximum distance for each lidar ray. Linear resolution of each lidar ray. The properties of the noise model that should be applied to generated scans The type of noise. Currently supported types are: "gaussian" (draw noise values independently for each beam from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. These elements are specific to logical camera sensors. A logical camera reports objects that fall within a frustum. Computation should be performed on the CPU. Near clipping distance of the view frustum Far clipping distance of the view frustum Aspect ratio of the near and far planes. This is the width divided by the height of the near or far planes. Horizontal field of view of the frustum, in radians. This is the angle between the frustum's vertex and the edges of the near or far plane. These elements are specific to a Magnetometer sensor. Parameters related to the body-frame X axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). 
"gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to the body-frame Y axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to the body-frame Z axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the ray (laser) sensor. 
The number of simulated rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle The number of simulated rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle specifies range properties of each simulated ray The minimum distance for each ray. The maximum distance for each ray. Linear resolution of each ray. The properties of the noise model that should be applied to generated scans The type of noise. Currently supported types are: "gaussian" (draw noise values independently for each beam from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. These elements are specific to the sonar sensor. The sonar collision shape. Currently supported geometries are: "cone" and "sphere". Minimum range Max range Radius of the sonar cone at max range. This parameter is only used if geometry is "cone". These elements are specific to a wireless transceiver. Service set identifier (network name) Specifies the frequency of transmission in MHz Only a frequency range is filtered. Here we set the lower bound (MHz). Only a frequency range is filtered. Here we set the upper bound (MHz). Specifies the antenna gain in dBi Specifies the transmission power in dBm Minimum received signal power in dBm These elements are specific to the force torque sensor. Frame in which to report the wrench values. Currently supported frames are: "parent" report the wrench expressed in the orientation of the parent link frame, "child" report the wrench expressed in the orientation of the child link frame, "sensor" report the wrench expressed in the orientation of the joint sensor frame. Note that for each option the point with respect to which the torque component of the wrench is expressed is the joint origin. Direction of the wrench measured by the sensor. The supported options are: "parent_to_child" if the measured wrench is the one applied by parent link on the child link, "child_to_parent" if the measured wrench is the one applied by the child link on the parent link. Name of the projector Texture name Field of view Near clip distance. Far clip distance. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. An audio sink. An audio source. URI of the audio media. 
Pitch for the audio media, in Hz Gain for the audio media, in dB. List of collision objects that will trigger audio playback. Name of child collision element that will trigger audio playback. True to make the audio source loop playback. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. Description of a battery. Unique name for the battery. Initial voltage in volts. The light element describes a light source. A unique name for the light. The light type: point, directional, spot. When true, the light will cast shadows. Diffuse light color Specular light color Light attenuation Range of the light The linear attenuation factor: 1 means attenuate evenly over the distance. The constant attenuation factor: 1.0 means never attenuate, 0.0 is complete attenuation. The quadratic attenuation factor: adds a curvature to the attenuation. Direction of the light, only applicable for spot and directional lights. Spot light parameters Angle covered by the bright inner cone Angle covered by the outer cone The rate of falloff between the inner and outer cones. 1.0 means a linear falloff, less means slower falloff, higher means faster falloff. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A joint connects two links with kinematic and dynamic properties. By default, the pose of a joint is expressed in the child link frame. A unique name for the joint within the scope of the model. The type of joint, which must be one of the following: (continuous) a hinge joint that rotates on a single axis with a continuous range of motion, (revolute) a hinge joint that rotates on a single axis with a fixed range of motion, (gearbox) geared revolute joints, (revolute2) same as two revolute joints connected in series, (prismatic) a sliding joint that slides along an axis with a limited range specified by upper and lower limits, (ball) a ball and socket joint, (screw) a single degree of freedom joint with coupled sliding and rotational motion, (universal) like a ball joint, but constrains one degree of freedom, (fixed) a joint with zero degrees of freedom that rigidly connects two links. Name of the parent link Name of the child link Parameter for gearbox joints. Given theta_1 and theta_2 defined in description for gearbox_reference_body, theta_2 = -gearbox_ratio * theta_1. Parameter for gearbox joints. Gearbox ratio is enforced over two joint angles. First joint angle (theta_1) is the angle from the gearbox_reference_body to the parent link in the direction of the axis element and the second joint angle (theta_2) is the angle from the gearbox_reference_body to the child link in the direction of the axis2 element. Parameter for screw joints. 
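For reference, the light element described above could be written as the following minimal sketch. The color, attenuation, and spot values are placeholders, and the spot block only applies when the type is "spot".

  <light name="lamp" type="spot">
    <cast_shadows>true</cast_shadows>
    <diffuse>0.9 0.9 0.9 1</diffuse>
    <specular>0.2 0.2 0.2 1</specular>
    <attenuation>
      <range>20</range>
      <constant>0.5</constant>    <!-- 1.0 = never attenuate -->
      <linear>0.1</linear>
      <quadratic>0.01</quadratic>
    </attenuation>
    <direction>0 0 -1</direction>
    <spot>
      <inner_angle>0.3</inner_angle>
      <outer_angle>0.6</outer_angle>
      <falloff>1.0</falloff>
    </spot>
  </light>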
Parameters related to the axis of rotation for revolute joints, the axis of translation for prismatic joints. Default joint position for this joint axis. Represents the x,y,z components of the axis unit vector. The axis is expressed in the joint frame unless the use_parent_model_frame flag is set to true. The vector should be normalized. Flag to interpret the axis xyz element in the parent model frame instead of joint frame. Provided for Gazebo compatibility (see https://bitbucket.org/osrf/gazebo/issue/494 ). An element specifying physical properties of the joint. These values are used to specify modeling properties of the joint, particularly useful for simulation. The physical velocity dependent viscous damping coefficient of the joint. The physical static friction value of the joint. The spring reference position for this joint axis. The spring stiffness for this joint axis. specifies the limits of this joint Specifies the lower joint limit (radians for revolute joints, meters for prismatic joints). Omit if joint is continuous. Specifies the upper joint limit (radians for revolute joints, meters for prismatic joints). Omit if joint is continuous. A value for enforcing the maximum joint effort applied. Limit is not enforced if value is negative. A value for enforcing the maximum joint velocity. Joint stop stiffness. Joint stop dissipation. Parameters related to the second axis of rotation for revolute2 joints and universal joints. Default joint position for this joint axis. Represents the x,y,z components of the axis unit vector. The axis is expressed in the joint frame unless the use_parent_model_frame flag is set to true. The vector should be normalized. Flag to interpret the axis xyz element in the parent model frame instead of joint frame. Provided for Gazebo compatibility (see https://bitbucket.org/osrf/gazebo/issue/494 ). An element specifying physical properties of the joint. These values are used to specify modeling properties of the joint, particularly useful for simulation. The physical velocity dependent viscous damping coefficient of the joint. EXPERIMENTAL: if damping coefficient is negative and implicit_spring_damper is true, adaptive damping is used. The physical static friction value of the joint. The spring reference position for this joint axis. The spring stiffness for this joint axis. An attribute specifying the lower joint limit (radians for revolute joints, meters for prismatic joints). Omit if joint is continuous. An attribute specifying the upper joint limit (radians for revolute joints, meters for prismatic joints). Omit if joint is continuous. An attribute for enforcing the maximum joint effort applied by Joint::SetForce. Limit is not enforced if value is negative. (not implemented) An attribute for enforcing the maximum joint velocity. Joint stop stiffness. Supported physics engines: SimBody. Joint stop dissipation. Supported physics engines: SimBody. Parameters that are specific to a certain physics engine. Simbody specific parameters Force cut in the multibody graph at this joint. ODE specific parameters (DEPRECATION WARNING: In SDFormat 1.5 this tag will be replaced by the same tag directly under the physics-block. For now, this tag overrides the one outside of ode-block, but in SDFormat 1.5 this tag will be removed completely.) If provide feedback is set to true, ODE will compute the constraint forces at this joint. 
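Pulling together the joint, axis, dynamics, and limit elements described above, a single-axis revolute joint might be written as in the sketch below. Link names and numeric values are illustrative only.

  <joint name="arm_joint" type="revolute">
    <parent>base_link</parent>
    <child>arm_link</child>
    <axis>
      <xyz>0 0 1</xyz>                <!-- expressed in the joint frame -->
      <dynamics>
        <damping>0.5</damping>
        <friction>0.1</friction>
      </dynamics>
      <limit>
        <lower>-1.57</lower>          <!-- radians for a revolute joint -->
        <upper>1.57</upper>
        <effort>30</effort>
        <velocity>2.0</velocity>
      </limit>
    </axis>
  </joint>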
If cfm damping is set to true, ODE will use CFM to simulate damping, allows for infinite damping, and one additional constraint row (previously used for joint limit) is always active. If implicit_spring_damper is set to true, ODE will use CFM, ERP to simulate stiffness and damping, allows for infinite damping, and one additional constraint row (previously used for joint limit) is always active. This replaces cfm_damping parameter in SDFormat 1.4. Scale the excess force in a joint motor at joint limits. Should be between zero and one. Constraint force mixing for constrained directions Error reduction parameter for constrained directions Bounciness of the limits Maximum force or torque used to reach the desired velocity. The desired velocity of the joint. Should only be set if you want the joint to move on load. Constraint force mixing parameter used by the joint stop Error reduction parameter used by the joint stop Suspension constraint force mixing parameter Suspension error reduction parameter If provide feedback is set to true, the physics engine will compute the constraint forces at this joint. For now, provide_feedback under ode block will override this tag and give a user warning about the migration. provide_feedback under ode is scheduled to be removed in SDFormat 1.5. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. The sensor tag describes the type and properties of a sensor. A unique name for the sensor. This name must not match another sensor in the model. The type name of the sensor. By default, SDFormat supports types air_pressure, altimeter, camera, contact, depth_camera, force_torque, gps, gpu_lidar, gpu_ray, imu, lidar, logical_camera, magnetometer, multicamera, ray, rfid, rfidtag, rgbd_camera, sonar, thermal_camera, wireless_receiver, and wireless_transmitter. The "ray" and "gpu_ray" types are equivalent to "lidar" and "gpu_lidar", respectively. It is preferred to use "lidar" and "gpu_lidar" since "ray" and "gpu_ray" will be deprecated. The "ray" and "gpu_ray" types are maintained for legacy support. If true the sensor will always be updated according to the update rate. The frequency at which the sensor data is generated. If left unspecified, the sensor will generate data every cycle. If true, the sensor is visualized in the GUI Name of the topic on which data is published. This is necessary for visualization A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. These elements are specific to an air pressure sensor. 
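A minimal air pressure sensor sketch is shown below for orientation; the element names follow the SDFormat sensor descriptions, but the update rate, reference altitude, and noise values are placeholders.

  <sensor name="barometer" type="air_pressure">
    <always_on>true</always_on>
    <update_rate>10</update_rate>
    <air_pressure>
      <reference_altitude>0</reference_altitude>  <!-- initial altitude in meters -->
      <pressure>
        <noise type="gaussian">
          <mean>0.0</mean>
          <stddev>3.0</stddev>
        </noise>
      </pressure>
    </air_pressure>
  </sensor>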
The initial altitude in meters. This value can be used by a sensor implementation to augment the altitude of the sensor. For example, if you are using simulation instead of creating a 1000 m mountain model on which to place your sensor, you could instead set this value to 1000 and place your model on a ground plane with a Z height of zero. Noise parameters for the pressure data. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to an altimeter sensor. Noise parameters for vertical position The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical velocity The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. 
For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to camera sensors. An optional name for the camera. Horizontal field of view The image size in pixels and format. Width in pixels Height in pixels (L8|R8G8B8|B8G8R8|BAYER_RGGB8|BAYER_BGGR8|BAYER_GBRG8|BAYER_GRBG8) The near and far clip planes. Objects closer or farther than these planes are not rendered. Near clipping plane Far clipping plane Enable or disable saving of camera frames. True = saving enabled The path name which will hold the frame data. If path name is relative, then directory is relative to current working directory. Depth camera parameters Type of output The near and far clip planes. Objects closer or farther than these planes are not detected by the depth camera. Near clipping plane for depth camera Far clipping plane for depth camera The properties of the noise model that should be applied to generated images The type of noise. Currently supported types are: "gaussian" (draw additive noise values independently for each pixel from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. Lens distortion to be applied to camera images. See http://en.wikipedia.org/wiki/Distortion_(optics)#Software_correction The radial distortion coefficient k1 The radial distortion coefficient k2 The radial distortion coefficient k3 The tangential distortion coefficient p1 The tangential distortion coefficient p2 The distortion center or principal point Lens projection description Type of the lens mapping. Supported values are gnomonical, stereographic, equidistant, equisolid_angle, orthographic, custom. For gnomonical (perspective) projection, it is recommended to specify a horizontal_fov of less than or equal to 90° If true the image will be scaled to fit horizontal FOV, otherwise it will be shown according to projection type parameters Definition of custom mapping function in a form of r=c1*f*fun(theta/c2 + c3). See https://en.wikipedia.org/wiki/Fisheye_lens#Mapping_function Linear scaling constant Angle scaling constant Angle offset constant Focal length of the optical system. Note: It's not a focal length of the lens in a common sense! This value is ignored if 'scale_to_fov' is set to true Possible values are 'sin', 'tan' and 'id' Everything outside of the specified angle will be hidden, 90° by default Resolution of the environment cube map used to draw the world Camera intrinsic parameters for setting a custom perspective projection matrix (cannot be used with WideAngleCamera since this class uses image stitching from 6 different cameras for achieving a wide field of view). The focal lengths can be computed using focal_length_in_pixels = (image_width_in_pixels * 0.5) / tan(field_of_view_in_degrees * 0.5 * PI/180) X focal length (in pixels, overrides horizontal_fov) Y focal length (in pixels, overrides horizontal_fov) X principal point (in pixels) Y principal point (in pixels) XY axis skew A frame of reference to which a pose is relative. 
Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. These elements are specific to the contact sensor. Name of the collision element within a link that acts as the contact sensor. Topic on which contact data is published. These elements are specific to the GPS sensor. Parameters related to GPS position measurement. Noise parameters for horizontal position measurement, in units of meters. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical position measurement, in units of meters. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to GPS velocity measurement. Noise parameters for horizontal velocity measurement, in units of meters/second. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. 
rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical velocity measurement, in units of meters/second. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the IMU sensor. This string represents special hardcoded use cases that are commonly seen with typical robot IMU's: - CUSTOM: use Euler angle custom_rpy orientation specification. The orientation of the IMU's reference frame is defined by adding the custom_rpy rotation to the parent_frame. - NED: The IMU XYZ aligns with NED, where NED orientation relative to Gazebo world is defined by the SphericalCoordinates class. - ENU: The IMU XYZ aligns with ENU, where ENU orientation relative to Gazebo world is defined by the SphericalCoordinates class. - NWU: The IMU XYZ aligns with NWU, where NWU orientation relative to Gazebo world is defined by the SphericalCoordinates class. - GRAV_UP: where direction of gravity maps to IMU reference frame Z-axis with Z-axis pointing in the opposite direction of gravity. IMU reference frame X-axis direction is defined by grav_dir_x. Note if grav_dir_x is parallel to gravity direction, this configuration fails. Otherwise, IMU reference frame X-axis is defined by projection of grav_dir_x onto a plane normal to the gravity vector. IMU reference frame Y-axis is a vector orthogonal to both X and Z axis following the right hand rule. - GRAV_DOWN: where direction of gravity maps to IMU reference frame Z-axis with Z-axis pointing in the direction of gravity. IMU reference frame X-axis direction is defined by grav_dir_x. 
Note if grav_dir_x is parallel to gravity direction, this configuration fails. Otherwise, IMU reference frame X-axis is defined by projection of grav_dir_x onto a plane normal to the gravity vector. IMU reference frame Y-axis is a vector orthogonal to both X and Z axis following the right hand rule. This field and parent_frame are used when localization is set to CUSTOM. Orientation (fixed axis roll, pitch yaw) transform from parent_frame to this IMU's reference frame. Some common examples are: - IMU reports in its local frame on boot. IMU sensor frame is the reference frame. Example: parent_frame="", custom_rpy="0 0 0" - IMU reports in Gazebo world frame. Example sdf: parent_frame="world", custom_rpy="0 0 0" - IMU reports in NWU frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between North-West-Up and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="0 0 0" - IMU reports in NED frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between North-East-Down and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="M_PI 0 0" - IMU reports in ENU frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between East-North-Up and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="0 0 -0.5*M_PI" - IMU reports in ROS optical frame as described in http://www.ros.org/reps/rep-0103.html#suffix-frames, which is (z-forward, x-left to right when facing +z, y-top to bottom when facing +z). (default gazebo camera is +x:view direction, +y:left, +z:up). Example sdf: parent_frame="local", custom_rpy="-0.5*M_PI 0 -0.5*M_PI" Name of parent frame which the custom_rpy transform is defined relative to. It can be any valid fully scoped Gazebo Link name or the special reserved "world" frame. If left empty, use the sensor's own local frame. Used when localization is set to GRAV_UP or GRAV_DOWN, a projection of this vector into a plane that is orthogonal to the gravity vector defines the direction of the IMU reference frame's X-axis. grav_dir_x is defined in the coordinate frame as defined by the parent_frame element. Name of parent frame in which the grav_dir_x vector is defined. It can be any valid fully scoped Gazebo Link name or the special reserved "world" frame. If left empty, use the sensor's own local frame. Topic on which data is published. DEPRECATED. Use the topic element that is a child of the sensor element. These elements are specific to body-frame angular velocity, which is expressed in radians per second Angular velocity about the X axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. 
For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Angular velocity about the Y axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Angular velocity about the Z axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to body-frame linear acceleration, which is expressed in meters per second squared Linear acceleration about the X axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. 
For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Linear acceleration about the Y axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Linear acceleration about the Z axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the lidar sensor. The number of simulated lidar rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. 
Must be greater or equal to min_angle The number of simulated lidar rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle specifies range properties of each simulated lidar The minimum distance for each lidar ray. The maximum distance for each lidar ray. Linear resolution of each lidar ray. The properties of the noise model that should be applied to generated scans The type of noise. Currently supported types are: "gaussian" (draw noise values independently for each beam from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. These elements are specific to logical camera sensors. A logical camera reports objects that fall within a frustum. Computation should be performed on the CPU. Near clipping distance of the view frustum Far clipping distance of the view frustum Aspect ratio of the near and far planes. This is the width divided by the height of the near or far planes. Horizontal field of view of the frustum, in radians. This is the angle between the frustum's vertex and the edges of the near or far plane. These elements are specific to a Magnetometer sensor. Parameters related to the body-frame X axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to the body-frame Y axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. 
For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to the body-frame Z axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the ray (laser) sensor. The number of simulated rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle The number of simulated rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle specifies range properties of each simulated ray The minimum distance for each ray. The maximum distance for each ray. Linear resolution of each ray. The properties of the noise model that should be applied to generated scans The type of noise. Currently supported types are: "gaussian" (draw noise values independently for each beam from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. These elements are specific to the sonar sensor. The sonar collision shape. Currently supported geometries are: "cone" and "sphere". Minimum range Max range Radius of the sonar cone at max range. This parameter is only used if geometry is "cone". These elements are specific to a wireless transceiver. Service set identifier (network name) Specifies the frequency of transmission in MHz Only a frequency range is filtered. Here we set the lower bound (MHz). Only a frequency range is filtered. Here we set the upper bound (MHz). 
Specifies the antenna gain in dBi Specifies the transmission power in dBm Minimum received signal power in dBm These elements are specific to the force torque sensor. Frame in which to report the wrench values. Currently supported frames are: "parent" report the wrench expressed in the orientation of the parent link frame, "child" report the wrench expressed in the orientation of the child link frame, "sensor" report the wrench expressed in the orientation of the joint sensor frame. Note that for each option the point with respect to which the torque component of the wrench is expressed is the joint origin. Direction of the wrench measured by the sensor. The supported options are: "parent_to_child" if the measured wrench is the one applied by parent link on the child link, "child_to_parent" if the measured wrench is the one applied by the child link on the parent link. A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. A special kind of model which can have a scripted motion. This includes both global waypoint type animations and skeleton animations. A unique name for the actor. (DEPRECATION WARNING: This is deprecated in 1.6 and removed in 1.7. Actors should be static, so this is always true.) Skin file which defines a visual and the underlying skeleton which moves it. Path to skin file, accepted formats: COLLADA, BVH. Scale the skin's size. Animation file defines an animation for the skeleton in the skin. The skeleton must be compatible with the skin skeleton. Unique name for animation. Path to animation file. Accepted formats: COLLADA, BVH. Scale for the animation skeleton. Set to true so the animation is interpolated on X. Adds scripted trajectories to the actor. Set this to true for the script to be repeated in a loop. For a fluid continuous motion, make sure the last waypoint matches the first one. This is the time to wait before starting the script. If running in a loop, this time will be waited before starting each cycle. Set to true if the animation should start as soon as the simulation starts playing. It is useful to set this to false if the animation should start playing only when triggered by a plugin, for example. The trajectory contains a series of keyframes to be followed. Unique id for a trajectory. If it matches the type of an animation, they will be played at the same time. The tension of the trajectory spline. The default value of zero equates to a Catmull-Rom spline, which may also cause the animation to overshoot keyframes. A value of one will cause the animation to stick to the keyframes. Each point in the trajectory. The time in seconds, counted from the beginning of the script, when the pose should be reached. The pose which should be reached at the given time. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A physical link with inertia, collision, and visual properties. 
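A minimal link sketch is given below as a roadmap for the inertial, collision, and visual elements described in the paragraphs that follow. The mass, inertia, and geometry values are placeholders.

  <link name="chassis">
    <pose>0 0 0.1 0 0 0</pose>
    <inertial>
      <mass>1.5</mass>
      <inertia>
        <ixx>0.01</ixx> <ixy>0</ixy> <ixz>0</ixz>
        <iyy>0.01</iyy> <iyz>0</iyz>
        <izz>0.01</izz>
      </inertia>
    </inertial>
    <collision name="chassis_collision">
      <geometry>
        <box><size>0.4 0.2 0.1</size></box>
      </geometry>
    </collision>
    <visual name="chassis_visual">
      <geometry>
        <box><size>0.4 0.2 0.1</size></box>
      </geometry>
    </visual>
  </link>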
A link must be a child of a model, and any number of links may exist in a model. A unique name for the link within the scope of the model. If true, the link is affected by gravity. If true, the link is affected by the wind. If true, the link can collide with other links in the model. Two links within a model will collide if link1.self_collide OR link2.self_collide. Links connected by a joint will never collide. If true, the link is kinematic only If true, the link will have 6DOF and be a direct child of world. Exponential damping of the link's velocity. Linear damping Angular damping A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. The inertial properties of the link. The mass of the link. The 3x3 rotational inertia matrix. Because the rotational inertia matrix is symmetric, only 6 above-diagonal elements of this matrix are specified here, using the attributes ixx, ixy, ixz, iyy, iyz, izz. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. This is the pose of the inertial reference frame, relative to the specified reference frame. The origin of the inertial reference frame needs to be at the center of gravity. The axes of the inertial reference frame do not need to be aligned with the principal axes of the inertia. Name of frame which the pose is defined relative to. The collision properties of a link. Note that this can be different from the visual properties of a link, for example, simpler collision models are often used to reduce computation time. Unique name for the collision element within the scope of the parent link. intensity value returned by laser sensor. Maximum number of contacts allowed between two entities. This value overrides the max_contacts element defined in physics. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. The shape of the visual or collision object. You can use the empty tag to make empty geometries. Box shape The three side lengths of the box. The origin of the box is in its geometric center (inside the center of the box). Cylinder shape Radius of the cylinder Length of the cylinder A heightmap based on a 2d grayscale image. URI to a grayscale image file The size of the heightmap in world units. When loading an image: "size" is used if present, otherwise defaults to 1x1x1. When loading a DEM: "size" is used if present, otherwise defaults to true size of DEM. A position offset. The heightmap can contain multiple textures. The order of the texture matters. 
The first texture will appear at the lowest height, and the last texture at the highest height. Use blend to control the height thresholds and fade between textures. Size of the applied texture in meters. Diffuse texture image filename Normalmap texture image filename The blend tag controls how two adjacent textures are mixed. The number of blend elements should equal one less than the number of textures. Min height of a blend layer Distance over which the blend occurs Set if the rendering engine will use terrain paging Samples per heightmap datum. For rasterized heightmaps, this indicates the number of samples to take per pixel. Using a lower value, e.g. 1, will generally improve the performance of the heightmap but lower the heightmap quality. Extrude a set of boxes from a grayscale image. URI of the grayscale image file Scaling factor applied to the image Grayscale threshold Height of the extruded boxes The amount of error in the model Mesh shape Mesh uri Use a named submesh. The submesh must exist in the mesh specified by the uri Name of the submesh within the parent mesh Set to true to center the vertices of the submesh at 0,0,0. This will effectively remove any transformations on the submesh before the poses from parent links and models are applied. Scaling factor applied to the mesh Plane shape Normal direction for the plane. When a Plane is used as a geometry for a Visual or Collision object, then the normal is specified in the Visual or Collision frame, respectively. Length of each side of the plane. Note that this property is meaningful only for visualizing the Plane, i.e., when the Plane is used as a geometry for a Visual object. The Plane has infinite size when used as a geometry for a Collision object. Defines an extruded polyline shape A series of points that define the path of the polyline. Height of the polyline Sphere shape radius of the sphere The surface parameters Bounciness coefficient of restitution, from [0...1], where 0=no bounciness. Bounce capture velocity, below which effective coefficient of restitution is 0. Parameters for torsional friction Torsional friction coefficient, unitless maximum ratio of tangential stress to normal stress. If this flag is true, torsional friction is calculated using the "patch_radius" parameter. If this flag is set to false, "surface_radius" (R) and contact depth (d) are used to compute the patch radius as sqrt(R*d). Radius of contact patch surface. Surface radius on the point of contact. Torsional friction parameters for ODE Force dependent slip for torsional friction, equivalent to inverse of viscous damping coefficient with units of rad/s/(Nm). A slip value of 0 is infinitely viscous. ODE friction parameters Coefficient of friction in first friction pyramid direction, the unitless maximum ratio of force in first friction pyramid direction to normal force. Coefficient of friction in second friction pyramid direction, the unitless maximum ratio of force in second friction pyramid direction to normal force. Unit vector specifying first friction pyramid direction in collision-fixed reference frame. If the friction pyramid model is in use, and this value is set to a unit vector for one of the colliding surfaces, the ODE Collide callback function will align the friction pyramid directions with a reference frame fixed to that collision surface. If both surfaces have this value set to a vector of zeros, the friction pyramid directions will be aligned with the world frame. If this value is set for both surfaces, the behavior is undefined. 
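The friction elements described above (together with the slip parameters covered next) typically sit under a collision's surface block, as in this sketch; the coefficients are placeholders and should be tuned to the materials being modeled.

  <surface>
    <friction>
      <ode>
        <mu>0.9</mu>            <!-- first friction pyramid direction -->
        <mu2>0.7</mu2>          <!-- second friction pyramid direction -->
        <fdir1>1 0 0</fdir1>    <!-- collision-fixed first direction -->
        <slip1>0.0</slip1>
        <slip2>0.0</slip2>
      </ode>
      <torsional>
        <coefficient>0.3</coefficient>
        <use_patch_radius>true</use_patch_radius>
        <patch_radius>0.05</patch_radius>
      </torsional>
    </friction>
  </surface>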
Force dependent slip in first friction pyramid direction, equivalent to inverse of viscous damping coefficient with units of m/s/N. A slip value of 0 is infinitely viscous. Force dependent slip in second friction pyramid direction, equivalent to inverse of viscous damping coefficient with units of m/s/N. A slip value of 0 is infinitely viscous. Coefficient of friction in first friction pyramid direction, the unitless maximum ratio of force in first friction pyramid direction to normal force. Coefficient of friction in second friction pyramid direction, the unitless maximum ratio of force in second friction pyramid direction to normal force. Unit vector specifying first friction pyramid direction in collision-fixed reference frame. If the friction pyramid model is in use, and this value is set to a unit vector for one of the colliding surfaces, the friction pyramid directions will be aligned with a reference frame fixed to that collision surface. If both surfaces have this value set to a vector of zeros, the friction pyramid directions will be aligned with the world frame. If this value is set for both surfaces, the behavior is undefined. Coefficient of rolling friction Flag to disable contact force generation, while still allowing collision checks and contact visualization to occur. Bitmask for collision filtering when collide_without_contact is on Bitmask for collision filtering. This will override collide_without_contact Bitmask for category of collision filtering. Collision happens if ((category1 & collision2) | (category2 & collision1)) is not zero. If not specified, the category_bitmask should be interpreted as being the same as collide_bitmask. Poisson's ratio is the unitless ratio between transverse and axial strain. This value must lie between (-1, 0.5). Defaults to 0.3 for typical steel. Note typical silicone elastomers have Poisson's ratio near 0.49 ~ 0.50. For reference, approximate values for Material:(Young's Modulus, Poisson's Ratio) for some of the typical materials are: Plastic: (1e8 ~ 3e9 Pa, 0.35 ~ 0.41), Wood: (4e9 ~ 1e10 Pa, 0.22 ~ 0.50), Aluminum: (7e10 Pa, 0.32 ~ 0.35), Steel: (2e11 Pa, 0.26 ~ 0.31). Young's Modulus in SI derived unit Pascal. Defaults to -1. If value is less or equal to zero, contact using elastic modulus (with Poisson's Ratio) is disabled. For reference, approximate values for Material:(Young's Modulus, Poisson's Ratio) for some of the typical materials are: Plastic: (1e8 ~ 3e9 Pa, 0.35 ~ 0.41), Wood: (4e9 ~ 1e10 Pa, 0.22 ~ 0.50), Aluminum: (7e10 Pa, 0.32 ~ 0.35), Steel: (2e11 Pa, 0.26 ~ 0.31). ODE contact parameters Soft constraint force mixing. Soft error reduction parameter dynamically "stiffness"-equivalent coefficient for contact joints dynamically "damping"-equivalent coefficient for contact joints maximum contact correction velocity truncation term. minimum allowable depth before contact correction impulse is applied Bullet contact parameters Soft constraint force mixing. Soft error reduction parameter dynamically "stiffness"-equivalent coefficient for contact joints dynamically "damping"-equivalent coefficient for contact joints Similar to ODE's max_vel implementation. See http://bulletphysics.org/mediawiki-1.5.8/index.php/BtContactSolverInfo#Split_Impulse for more information. Similar to ODE's max_vel implementation. See http://bulletphysics.org/mediawiki-1.5.8/index.php/BtContactSolverInfo#Split_Impulse for more information. 
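To show how the surface, bounce, friction, and contact parameters above fit together, here is a minimal illustrative sketch of a collision element. The element names follow the descriptions above, while the collision name and all numeric values are arbitrary placeholders rather than recommended defaults.

  <collision name="base_collision">
    <geometry>
      <sphere><radius>0.5</radius></sphere>
    </geometry>
    <surface>
      <bounce>
        <restitution_coefficient>0.2</restitution_coefficient>
        <threshold>0.05</threshold>
      </bounce>
      <friction>
        <torsional>
          <coefficient>0.8</coefficient>
          <use_patch_radius>true</use_patch_radius>
          <patch_radius>0.05</patch_radius>
        </torsional>
        <ode>
          <mu>0.6</mu>
          <mu2>0.5</mu2>
          <fdir1>1 0 0</fdir1>
          <slip1>0</slip1>
          <slip2>0</slip2>
        </ode>
      </friction>
      <contact>
        <collide_bitmask>1</collide_bitmask>
        <ode>
          <kp>1e6</kp>
          <kd>1</kd>
          <max_vel>0.01</max_vel>
          <min_depth>0.001</min_depth>
        </ode>
      </contact>
    </surface>
  </collision>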
Soft contact parameters based on the paper: http://www.cc.gatech.edu/graphics/projects/Sumit/homepage/papers/sigasia11/jain_softcontacts_siga11.pdf This is variable k_v in the soft contacts paper. Its unit is N/m. This is variable k_e in the soft contacts paper. Its unit is N/m. Viscous damping of point velocity in body frame. Its unit is N/m/s. Fraction of mass to be distributed among deformable nodes. The visual properties of the link. This element specifies the shape of the object (box, cylinder, etc.) for visualization purposes. Unique name for the visual element within the scope of the parent link. If true the visual will cast shadows. Will be implemented in a future release. The amount of transparency (0 = opaque, 1 = fully transparent). Optional meta information for the visual. The information contained within this element should be used to provide additional feedback to an end user. The layer in which this visual is displayed. The layer number is useful for programs, such as Gazebo, that put visuals in different layers for enhanced visualization. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. The material of the visual element. Name of material from an installed script file. This will override the color element if the script exists. URI of the material script file Name of the script within the script file vertex, pixel, normal_map_objectspace, normal_map_tangentspace Filename of the normal map If false, dynamic lighting will be disabled The ambient color of a material specified by a set of four numbers representing red/green/blue/alpha, each in the range of [0,1]. The diffuse color of a material specified by a set of four numbers representing red/green/blue/alpha, each in the range of [0,1]. The specular color of a material specified by a set of four numbers representing red/green/blue/alpha, each in the range of [0,1]. The emissive color of a material specified by a set of four numbers representing red/green/blue/alpha, each in the range of [0,1]. Physically Based Rendering (PBR) material. There are two PBR workflows: metal and specular. While both workflows and their parameters can be specified at the same time, typically only one of them will be used (depending on the underlying renderer capability). It is also recommended to use the same workflow for all materials in the world. PBR using the Metallic/Roughness workflow. Filename of the diffuse/albedo map. Filename of the roughness map. Material roughness in the range of [0,1], where 0 represents a smooth surface and 1 represents a rough surface. This is the inverse of a specular map in a PBR specular workflow. Filename of the metalness map. Material metalness in the range of [0,1], where 0 represents non-metal and 1 represents raw metal. Filename of the environment / reflection map, typically in the form of a cubemap. Filename of the ambient occlusion map. The map defines the amount of ambient lighting on the surface. Filename of the normal map. The normals can be in the object space or tangent space as specified in the 'type' attribute. The space that the normals are in. Values are: 'object' or 'tangent'. Filename of the emissive map.
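As a sketch of the visual and material elements described above, the fragment below combines a classic material script with the metal PBR workflow; the script URI, texture filenames, and color values are hypothetical placeholders. The specular workflow described next is declared analogously under a specular element instead of metal.

  <visual name="body_visual">
    <cast_shadows>true</cast_shadows>
    <transparency>0</transparency>
    <material>
      <script>
        <uri>file://media/materials/scripts/gazebo.material</uri>
        <name>Gazebo/Grey</name>
      </script>
      <ambient>0.3 0.3 0.3 1</ambient>
      <diffuse>0.7 0.7 0.7 1</diffuse>
      <specular>0.1 0.1 0.1 1</specular>
      <emissive>0 0 0 1</emissive>
      <pbr>
        <metal>
          <albedo_map>materials/textures/body_albedo.png</albedo_map>
          <roughness>0.5</roughness>
          <metalness>0.1</metalness>
          <normal_map type="tangent">materials/textures/body_normal.png</normal_map>
        </metal>
      </pbr>
    </material>
  </visual>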
PBR using the Specular/Glossiness workflow. Filename of the diffuse/albedo map. Filename of the specular map. Filename of the glossiness map. Material glossiness in the range of [0-1], where 0 represents a rough surface and 1 represents a smooth surface. This is the inverse of a roughness map in a PBR metal workflow. Filename of the ambient occlusion map. The map defines the amount of ambient lighting on the surface. Filename of the normal map. The normals can be in the object space or tangent space as specified in the 'type' attribute The space that the normals are in. Values are: 'object' or 'tangent' Filename of the emissive map. The shape of the visual or collision object. You can use the empty tag to make empty geometries. Box shape The three side lengths of the box. The origin of the box is in its geometric center (inside the center of the box). Cylinder shape Radius of the cylinder Length of the cylinder A heightmap based on a 2d grayscale image. URI to a grayscale image file The size of the heightmap in world units. When loading an image: "size" is used if present, otherwise defaults to 1x1x1. When loading a DEM: "size" is used if present, otherwise defaults to true size of DEM. A position offset. The heightmap can contain multiple textures. The order of the texture matters. The first texture will appear at the lowest height, and the last texture at the highest height. Use blend to control the height thresholds and fade between textures. Size of the applied texture in meters. Diffuse texture image filename Normalmap texture image filename The blend tag controls how two adjacent textures are mixed. The number of blend elements should equal one less than the number of textures. Min height of a blend layer Distance over which the blend occurs Set if the rendering engine will use terrain paging Samples per heightmap datum. For rasterized heightmaps, this indicates the number of samples to take per pixel. Using a lower value, e.g. 1, will generally improve the performance of the heightmap but lower the heightmap quality. Extrude a set of boxes from a grayscale image. URI of the grayscale image file Scaling factor applied to the image Grayscale threshold Height of the extruded boxes The amount of error in the model Mesh shape Mesh uri Use a named submesh. The submesh must exist in the mesh specified by the uri Name of the submesh within the parent mesh Set to true to center the vertices of the submesh at 0,0,0. This will effectively remove any transformations on the submesh before the poses from parent links and models are applied. Scaling factor applied to the mesh Plane shape Normal direction for the plane. When a Plane is used as a geometry for a Visual or Collision object, then the normal is specified in the Visual or Collision frame, respectively. Length of each side of the plane. Note that this property is meaningful only for visualizing the Plane, i.e., when the Plane is used as a geometry for a Visual object. The Plane has infinite size when used as a geometry for a Collision object. Defines an extruded polyline shape A series of points that define the path of the polyline. Height of the polyline Sphere shape radius of the sphere A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. The sensor tag describes the type and properties of a sensor. 
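Before the sensor description continues, here is a brief sketch tying together the geometry shapes described above; the same geometry element is used by both visual and collision objects. The mesh URI, submesh name, and dimensions are illustrative only.

  <visual name="chassis_visual">
    <geometry>
      <mesh>
        <uri>model://example_robot/meshes/chassis.dae</uri>
        <submesh>
          <name>Body</name>
          <center>false</center>
        </submesh>
        <scale>1 1 1</scale>
      </mesh>
    </geometry>
  </visual>
  <collision name="chassis_collision">
    <geometry>
      <box>
        <size>1.0 0.5 0.25</size>
      </box>
    </geometry>
  </collision>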
A unique name for the sensor. This name must not match another sensor in the model. The type name of the sensor. By default, SDFormat supports types air_pressure, altimeter, camera, contact, depth_camera, force_torque, gps, gpu_lidar, gpu_ray, imu, lidar, logical_camera, magnetometer, multicamera, ray, rfid, rfidtag, rgbd_camera, sonar, thermal_camera, wireless_receiver, and wireless_transmitter. The "ray" and "gpu_ray" types are equivalent to "lidar" and "gpu_lidar", respectively. It is preferred to use "lidar" and "gpu_lidar" since "ray" and "gpu_ray" will be deprecated. The "ray" and "gpu_ray" types are maintained for legacy support. If true the sensor will always be updated according to the update rate. The frequency at which the sensor data is generated. If left unspecified, the sensor will generate data every cycle. If true, the sensor is visualized in the GUI Name of the topic on which data is published. This is necessary for visualization A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. These elements are specific to an air pressure sensor. The initial altitude in meters. This value can be used by a sensor implementation to augment the altitude of the sensor. For example, if you are using simulation instead of creating a 1000 m mountain model on which to place your sensor, you could instead set this value to 1000 and place your model on a ground plane with a Z height of zero. Noise parameters for the pressure data. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to an altimeter sensor. Noise parameters for vertical position The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise).
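For the air pressure sensor just described, a minimal sketch combining the generic sensor attributes with the reference altitude and pressure noise might look as follows; the sensor name, topic, update rate, and noise values are illustrative assumptions.

  <sensor name="barometer" type="air_pressure">
    <always_on>true</always_on>
    <update_rate>20</update_rate>
    <topic>air_pressure</topic>
    <air_pressure>
      <!-- Augments the sensor altitude, e.g. standing in for a 1000 m mountain -->
      <reference_altitude>1000</reference_altitude>
      <pressure>
        <noise type="gaussian">
          <mean>0</mean>
          <stddev>3.0</stddev>
        </noise>
      </pressure>
    </air_pressure>
  </sensor>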
"gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical velocity The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to camera sensors. An optional name for the camera. Horizontal field of view The image size in pixels and format. Width in pixels Height in pixels (L8|R8G8B8|B8G8R8|BAYER_RGGB8|BAYER_BGGR8|BAYER_GBRG8|BAYER_GRBG8) The near and far clip planes. Objects closer or farther than these planes are not rendered. Near clipping plane Far clipping plane Enable or disable saving of camera frames. True = saving enabled The path name which will hold the frame data. If path name is relative, then directory is relative to current working directory. Depth camera parameters Type of output The near and far clip planes. Objects closer or farther than these planes are not detected by the depth camera. Near clipping plane for depth camera Far clipping plane for depth camera The properties of the noise model that should be applied to generated images The type of noise. Currently supported types are: "gaussian" (draw additive noise values independently for each pixel from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. Lens distortion to be applied to camera images. 
See http://en.wikipedia.org/wiki/Distortion_(optics)#Software_correction The radial distortion coefficient k1 The radial distortion coefficient k2 The radial distortion coefficient k3 The tangential distortion coefficient p1 The tangential distortion coefficient p2 The distortion center or principal point Lens projection description Type of the lens mapping. Supported values are gnomonical, stereographic, equidistant, equisolid_angle, orthographic, custom. For gnomonical (perspective) projection, it is recommended to specify a horizontal_fov of less than or equal to 90° If true the image will be scaled to fit horizontal FOV, otherwise it will be shown according to projection type parameters Definition of custom mapping function in a form of r=c1*f*fun(theta/c2 + c3). See https://en.wikipedia.org/wiki/Fisheye_lens#Mapping_function Linear scaling constant Angle scaling constant Angle offset constant Focal length of the optical system. Note: It's not a focal length of the lens in a common sense! This value is ignored if 'scale_to_fov' is set to true Possible values are 'sin', 'tan' and 'id' Everything outside of the specified angle will be hidden, 90° by default Resolution of the environment cube map used to draw the world Camera intrinsic parameters for setting a custom perspective projection matrix (cannot be used with WideAngleCamera since this class uses image stitching from 6 different cameras for achieving a wide field of view). The focal lengths can be computed using focal_length_in_pixels = (image_width_in_pixels * 0.5) / tan(field_of_view_in_degrees * 0.5 * PI/180) X focal length (in pixels, overrides horizontal_fov) Y focal length (in pixels, overrides horizontal_fov) X principal point (in pixels) Y principal point (in pixels) XY axis skew A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. These elements are specific to the contact sensor. name of the collision element within a link that acts as the contact sensor. Topic on which contact data is published. These elements are specific to the GPS sensor. Parameters related to GPS position measurement. Noise parameters for horizontal position measurement, in units of meters. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. 
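Returning to the camera distortion and intrinsics described above: the focal-length formula gives, for a 640-pixel-wide image with a 60° (about 1.047 rad) field of view, focal_length_in_pixels = (640 * 0.5) / tan(60 * 0.5 * PI/180) ≈ 554.3. A hedged sketch of the corresponding elements follows; the distortion coefficients and principal point are placeholder values, not calibration data.

  <camera>
    <horizontal_fov>1.047</horizontal_fov>
    <image><width>640</width><height>480</height></image>
    <distortion>
      <k1>-0.25</k1>
      <k2>0.12</k2>
      <k3>0.0</k3>
      <p1>-0.0003</p1>
      <p2>0.0001</p2>
      <center>0.5 0.5</center>
    </distortion>
    <lens>
      <intrinsics>
        <fx>554.3</fx>
        <fy>554.3</fy>
        <cx>320</cx>
        <cy>240</cy>
        <s>0</s>
      </intrinsics>
    </lens>
  </camera>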
A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical position measurement, in units of meters. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to GPS velocity measurement. Noise parameters for horizontal velocity measurement, in units of meters/second. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical velocity measurement, in units of meters/second. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn.
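A GPS sensor grouping the position and velocity noise parameters described above might be sketched as follows, assuming the position_sensing/velocity_sensing and horizontal/vertical element names used by the SDFormat gps sensor; standard deviations are in meters for position and meters/second for velocity, and all values are illustrative.

  <sensor name="gps_sensor" type="gps">
    <update_rate>10</update_rate>
    <gps>
      <position_sensing>
        <horizontal>
          <noise type="gaussian">
            <mean>0</mean>
            <stddev>2.0</stddev>
          </noise>
        </horizontal>
        <vertical>
          <noise type="gaussian">
            <mean>0</mean>
            <stddev>4.0</stddev>
          </noise>
        </vertical>
      </position_sensing>
      <velocity_sensing>
        <horizontal>
          <noise type="gaussian">
            <mean>0</mean>
            <stddev>0.1</stddev>
          </noise>
        </horizontal>
      </velocity_sensing>
    </gps>
  </sensor>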
For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the IMU sensor. This string represents special hardcoded use cases that are commonly seen with typical robot IMUs:
- CUSTOM: use Euler angle custom_rpy orientation specification. The orientation of the IMU's reference frame is defined by adding the custom_rpy rotation to the parent_frame.
- NED: The IMU XYZ aligns with NED, where NED orientation relative to the Gazebo world is defined by the SphericalCoordinates class.
- ENU: The IMU XYZ aligns with ENU, where ENU orientation relative to the Gazebo world is defined by the SphericalCoordinates class.
- NWU: The IMU XYZ aligns with NWU, where NWU orientation relative to the Gazebo world is defined by the SphericalCoordinates class.
- GRAV_UP: the direction of gravity maps to the IMU reference frame Z-axis, with the Z-axis pointing in the opposite direction of gravity. The IMU reference frame X-axis direction is defined by grav_dir_x. Note that if grav_dir_x is parallel to the gravity direction, this configuration fails. Otherwise, the IMU reference frame X-axis is defined by the projection of grav_dir_x onto a plane normal to the gravity vector. The IMU reference frame Y-axis is a vector orthogonal to both the X and Z axes following the right hand rule.
- GRAV_DOWN: the direction of gravity maps to the IMU reference frame Z-axis, with the Z-axis pointing in the direction of gravity. The IMU reference frame X-axis direction is defined by grav_dir_x. Note that if grav_dir_x is parallel to the gravity direction, this configuration fails. Otherwise, the IMU reference frame X-axis is defined by the projection of grav_dir_x onto a plane normal to the gravity vector. The IMU reference frame Y-axis is a vector orthogonal to both the X and Z axes following the right hand rule.
This field and parent_frame are used when localization is set to CUSTOM. Orientation (fixed axis roll, pitch yaw) transform from parent_frame to this IMU's reference frame. Some common examples are:
- IMU reports in its local frame on boot. The IMU sensor frame is the reference frame. Example: parent_frame="", custom_rpy="0 0 0"
- IMU reports in the Gazebo world frame. Example SDF: parent_frame="world", custom_rpy="0 0 0"
- IMU reports in the NWU frame. Uses the SphericalCoordinates class to determine the world frame in relation to magnetic north and gravity; i.e. the rotation between North-West-Up and the world (+X,+Y,+Z) frame is defined by the SphericalCoordinates class. Example SDF given the world is NWU: parent_frame="world", custom_rpy="0 0 0"
- IMU reports in the NED frame. Uses the SphericalCoordinates class to determine the world frame in relation to magnetic north and gravity; i.e. the rotation between North-East-Down and the world (+X,+Y,+Z) frame is defined by the SphericalCoordinates class. Example SDF given the world is NWU: parent_frame="world", custom_rpy="M_PI 0 0"
- IMU reports in the ENU frame. Uses the SphericalCoordinates class to determine the world frame in relation to magnetic north and gravity; i.e. the rotation between East-North-Up and the world (+X,+Y,+Z) frame is defined by the SphericalCoordinates class. Example SDF given the world is NWU: parent_frame="world", custom_rpy="0 0 -0.5*M_PI"
- IMU reports in the ROS optical frame as described in http://www.ros.org/reps/rep-0103.html#suffix-frames, which is (z-forward, x-left to right when facing +z, y-top to bottom when facing +z). (The default Gazebo camera is +x: view direction, +y: left, +z: up.) Example SDF: parent_frame="local", custom_rpy="-0.5*M_PI 0 -0.5*M_PI"
Name of the parent frame which the custom_rpy transform is defined relative to. It can be any valid fully scoped Gazebo Link name or the special reserved "world" frame. If left empty, use the sensor's own local frame. Used when localization is set to GRAV_UP or GRAV_DOWN; a projection of this vector into a plane that is orthogonal to the gravity vector defines the direction of the IMU reference frame's X-axis. grav_dir_x is defined in the coordinate frame as defined by the parent_frame element. Name of the parent frame in which the grav_dir_x vector is defined. It can be any valid fully scoped Gazebo Link name or the special reserved "world" frame. If left empty, use the sensor's own local frame. Topic on which data is published. DEPRECATED. Use the topic element that is a child of the sensor element. These elements are specific to body-frame angular velocity, which is expressed in radians per second Angular velocity about the X axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour).
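Pulling the IMU elements above together, here is a sketch with an ENU reference frame and per-axis angular velocity noise; only the X axis is shown, the Y and Z axes (and the linear acceleration axes described next) follow the same pattern, and all values are illustrative.

  <sensor name="imu_sensor" type="imu">
    <always_on>true</always_on>
    <update_rate>250</update_rate>
    <imu>
      <orientation_reference_frame>
        <localization>ENU</localization>
      </orientation_reference_frame>
      <angular_velocity>
        <x>
          <noise type="gaussian">
            <mean>0</mean>
            <stddev>0.0002</stddev>
            <bias_mean>7.5e-6</bias_mean>
            <bias_stddev>8.0e-7</bias_stddev>
            <dynamic_bias_stddev>2.0e-6</dynamic_bias_stddev>
            <dynamic_bias_correlation_time>3600</dynamic_bias_correlation_time>
          </noise>
        </x>
      </angular_velocity>
    </imu>
  </sensor>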
For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Angular velocity about the Z axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to body-frame linear acceleration, which is expressed in meters per second squared Linear acceleration about the X axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Linear acceleration about the Y axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. 
For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Linear acceleration about the Z axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the lidar sensor. The number of simulated lidar rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle The number of simulated lidar rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle specifies range properties of each simulated lidar The minimum distance for each lidar ray. The maximum distance for each lidar ray. Linear resolution of each lidar ray. The properties of the noise model that should be applied to generated scans The type of noise. Currently supported types are: "gaussian" (draw noise values independently for each beam from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. These elements are specific to logical camera sensors. A logical camera reports objects that fall within a frustum. Computation should be performed on the CPU. Near clipping distance of the view frustum Far clipping distance of the view frustum Aspect ratio of the near and far planes. This is the width divided by the height of the near or far planes. Horizontal field of view of the frustum, in radians. This is the angle between the frustum's vertex and the edges of the near or far plane. These elements are specific to a Magnetometer sensor. Parameters related to the body-frame X axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). 
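The lidar scan and range elements described above compose as in this sketch of a horizontal scanner; sample counts, angles (in radians), and ranges (in meters) are illustrative.

  <sensor name="front_lidar" type="gpu_lidar">
    <update_rate>10</update_rate>
    <lidar>
      <scan>
        <horizontal>
          <samples>640</samples>
          <resolution>1</resolution>
          <min_angle>-1.57</min_angle>
          <max_angle>1.57</max_angle>
        </horizontal>
      </scan>
      <range>
        <min>0.08</min>
        <max>30.0</max>
        <resolution>0.01</resolution>
      </range>
      <noise>
        <type>gaussian</type>
        <mean>0</mean>
        <stddev>0.01</stddev>
      </noise>
    </lidar>
  </sensor>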
"gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to the body-frame Y axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to the body-frame Z axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the ray (laser) sensor. 
The number of simulated rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle The number of simulated rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle specifies range properties of each simulated ray The minimum distance for each ray. The maximum distance for each ray. Linear resolution of each ray. The properties of the noise model that should be applied to generated scans The type of noise. Currently supported types are: "gaussian" (draw noise values independently for each beam from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. These elements are specific to the sonar sensor. The sonar collision shape. Currently supported geometries are: "cone" and "sphere". Minimum range Maximum range Radius of the sonar cone at max range. This parameter is only used if geometry is "cone". These elements are specific to a wireless transceiver. Service set identifier (network name) Specifies the frequency of transmission in MHz Only a frequency range is filtered. Here we set the lower bound (MHz). Only a frequency range is filtered. Here we set the upper bound (MHz). Specifies the antenna gain in dBi Specifies the transmission power in dBm Minimum received signal power in dBm These elements are specific to the force torque sensor. Frame in which to report the wrench values. Currently supported frames are: "parent" report the wrench expressed in the orientation of the parent link frame, "child" report the wrench expressed in the orientation of the child link frame, "sensor" report the wrench expressed in the orientation of the joint sensor frame. Note that for each option the point with respect to which the torque component of the wrench is expressed is the joint origin. Direction of the wrench measured by the sensor. The supported options are: "parent_to_child" if the measured wrench is the one applied by parent link on the child link, "child_to_parent" if the measured wrench is the one applied by the child link on the parent link. Name of the projector Texture name Field of view Near clip distance Far clip distance A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. An audio sink. An audio source. URI of the audio media.
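Of the sensors described above, the force-torque and sonar sensors might be declared as in the following sketches; the frame and measure_direction values come from the options listed above, while the names and numeric values are illustrative (a force-torque sensor is attached to a joint rather than a link).

  <joint name="wrist_joint" type="revolute">
    <!-- parent, child, and axis omitted for brevity -->
    <sensor name="wrist_ft" type="force_torque">
      <update_rate>100</update_rate>
      <force_torque>
        <frame>child</frame>
        <measure_direction>parent_to_child</measure_direction>
      </force_torque>
    </sensor>
  </joint>

  <sensor name="sonar_sensor" type="sonar">
    <sonar>
      <geometry>cone</geometry>
      <min>0.05</min>
      <max>2.0</max>
      <radius>0.3</radius>
    </sonar>
  </sensor>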
Pitch for the audio media, in Hz Gain for the audio media, in dB. List of collision objects that will trigger audio playback. Name of child collision element that will trigger audio playback. True to make the audio source loop playback. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. Description of a battery. Unique name for the battery. Initial voltage in volts. The light element describes a light source. A unique name for the light. The light type: point, directional, spot. When true, the light will cast shadows. Diffuse light color Specular light color Light attenuation Range of the light The linear attenuation factor: 1 means attenuate evenly over the distance. The constant attenuation factor: 1.0 means never attenuate, 0.0 is complete attenuation. The quadratic attenuation factor: adds a curvature to the attenuation. Direction of the light, only applicable for spot and directional lights. Spot light parameters Angle covered by the bright inner cone Angle covered by the outer cone The rate of falloff between the inner and outer cones. 1.0 means a linear falloff, less means slower falloff, higher means faster falloff. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A joint connects two links with kinematic and dynamic properties. By default, the pose of a joint is expressed in the child link frame. A unique name for the joint within the scope of the model. The type of joint, which must be one of the following: (continuous) a hinge joint that rotates on a single axis with a continuous range of motion, (revolute) a hinge joint that rotates on a single axis with a fixed range of motion, (gearbox) geared revolute joints, (revolute2) same as two revolute joints connected in series, (prismatic) a sliding joint that slides along an axis with a limited range specified by upper and lower limits, (ball) a ball and socket joint, (screw) a single degree of freedom joint with coupled sliding and rotational motion, (universal) like a ball joint, but constrains one degree of freedom, (fixed) a joint with zero degrees of freedom that rigidly connects two links. Name of the parent link Name of the child link Parameter for gearbox joints. Given theta_1 and theta_2 defined in description for gearbox_reference_body, theta_2 = -gearbox_ratio * theta_1. Parameter for gearbox joints. Gearbox ratio is enforced over two joint angles. First joint angle (theta_1) is the angle from the gearbox_reference_body to the parent link in the direction of the axis element and the second joint angle (theta_2) is the angle from the gearbox_reference_body to the child link in the direction of the axis2 element. Parameter for screw joints.
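The light element described above, configured as a spot light, could be sketched as follows; the colors, attenuation factors, and angles are illustrative.

  <light name="lamp" type="spot">
    <cast_shadows>true</cast_shadows>
    <diffuse>0.9 0.9 0.9 1</diffuse>
    <specular>0.2 0.2 0.2 1</specular>
    <attenuation>
      <range>20</range>
      <constant>0.5</constant>
      <linear>0.01</linear>
      <quadratic>0.001</quadratic>
    </attenuation>
    <direction>0 0 -1</direction>
    <spot>
      <inner_angle>0.6</inner_angle>
      <outer_angle>1.0</outer_angle>
      <falloff>1.0</falloff>
    </spot>
    <pose>0 0 3 0 0 0</pose>
  </light>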
Parameters related to the axis of rotation for revolute joints, the axis of translation for prismatic joints. Default joint position for this joint axis. Represents the x,y,z components of the axis unit vector. The axis is expressed in the joint frame unless the use_parent_model_frame flag is set to true. The vector should be normalized. Flag to interpret the axis xyz element in the parent model frame instead of joint frame. Provided for Gazebo compatibility (see https://bitbucket.org/osrf/gazebo/issue/494 ). An element specifying physical properties of the joint. These values are used to specify modeling properties of the joint, particularly useful for simulation. The physical velocity dependent viscous damping coefficient of the joint. The physical static friction value of the joint. The spring reference position for this joint axis. The spring stiffness for this joint axis. specifies the limits of this joint Specifies the lower joint limit (radians for revolute joints, meters for prismatic joints). Omit if joint is continuous. Specifies the upper joint limit (radians for revolute joints, meters for prismatic joints). Omit if joint is continuous. A value for enforcing the maximum joint effort applied. Limit is not enforced if value is negative. A value for enforcing the maximum joint velocity. Joint stop stiffness. Joint stop dissipation. Parameters related to the second axis of rotation for revolute2 joints and universal joints. Default joint position for this joint axis. Represents the x,y,z components of the axis unit vector. The axis is expressed in the joint frame unless the use_parent_model_frame flag is set to true. The vector should be normalized. Flag to interpret the axis xyz element in the parent model frame instead of joint frame. Provided for Gazebo compatibility (see https://bitbucket.org/osrf/gazebo/issue/494 ). An element specifying physical properties of the joint. These values are used to specify modeling properties of the joint, particularly useful for simulation. The physical velocity dependent viscous damping coefficient of the joint. EXPERIMENTAL: if damping coefficient is negative and implicit_spring_damper is true, adaptive damping is used. The physical static friction value of the joint. The spring reference position for this joint axis. The spring stiffness for this joint axis. An attribute specifying the lower joint limit (radians for revolute joints, meters for prismatic joints). Omit if joint is continuous. An attribute specifying the upper joint limit (radians for revolute joints, meters for prismatic joints). Omit if joint is continuous. An attribute for enforcing the maximum joint effort applied by Joint::SetForce. Limit is not enforced if value is negative. (not implemented) An attribute for enforcing the maximum joint velocity. Joint stop stiffness. Supported physics engines: SimBody. Joint stop dissipation. Supported physics engines: SimBody. Parameters that are specific to a certain physics engine. Simbody specific parameters Force cut in the multibody graph at this joint. ODE specific parameters (DEPRECATION WARNING: In SDFormat 1.5 this tag will be replaced by the same tag directly under the physics-block. For now, this tag overrides the one outside of ode-block, but in SDFormat 1.5 this tag will be removed completely.) If provide feedback is set to true, ODE will compute the constraint forces at this joint. 
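A revolute joint combining the axis, dynamics, and limit elements described above might look like the sketch below; the engine-specific parameters that follow would be nested under a physics element inside the joint. Link names, limits, and coefficients are illustrative.

  <joint name="shoulder_joint" type="revolute">
    <parent>base_link</parent>
    <child>upper_arm</child>
    <axis>
      <xyz>0 0 1</xyz>
      <dynamics>
        <damping>0.2</damping>
        <friction>0.05</friction>
        <spring_reference>0</spring_reference>
        <spring_stiffness>0</spring_stiffness>
      </dynamics>
      <limit>
        <lower>-1.57</lower>
        <upper>1.57</upper>
        <effort>50</effort>
        <velocity>2.0</velocity>
      </limit>
    </axis>
  </joint>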
If cfm damping is set to true, ODE will use CFM to simulate damping, which allows for infinite damping, and one additional constraint row (previously used for joint limit) is always active. If implicit_spring_damper is set to true, ODE will use CFM, ERP to simulate stiffness and damping, which allows for infinite damping, and one additional constraint row (previously used for joint limit) is always active. This replaces the cfm_damping parameter in SDFormat 1.4. Scale the excess force in a joint motor at joint limits. Should be between zero and one. Constraint force mixing for constrained directions Error reduction parameter for constrained directions Bounciness of the limits Maximum force or torque used to reach the desired velocity. The desired velocity of the joint. Should only be set if you want the joint to move on load. Constraint force mixing parameter used by the joint stop Error reduction parameter used by the joint stop Suspension constraint force mixing parameter Suspension error reduction parameter If provide feedback is set to true, the physics engine will compute the constraint forces at this joint. For now, provide_feedback under the ode block will override this tag and give a user warning about the migration. provide_feedback under ode is scheduled to be removed in SDFormat 1.5. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. The sensor tag describes the type and properties of a sensor. A unique name for the sensor. This name must not match another sensor in the model. The type name of the sensor. By default, SDFormat supports types air_pressure, altimeter, camera, contact, depth_camera, force_torque, gps, gpu_lidar, gpu_ray, imu, lidar, logical_camera, magnetometer, multicamera, ray, rfid, rfidtag, rgbd_camera, sonar, thermal_camera, wireless_receiver, and wireless_transmitter. The "ray" and "gpu_ray" types are equivalent to "lidar" and "gpu_lidar", respectively. It is preferred to use "lidar" and "gpu_lidar" since "ray" and "gpu_ray" will be deprecated. The "ray" and "gpu_ray" types are maintained for legacy support. If true the sensor will always be updated according to the update rate. The frequency at which the sensor data is generated. If left unspecified, the sensor will generate data every cycle. If true, the sensor is visualized in the GUI Name of the topic on which data is published. This is necessary for visualization A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. These elements are specific to an air pressure sensor.
The initial altitude in meters. This value can be used by a sensor implementation to augment the altitude of the sensor. For example, if you are using simulation instead of creating a 1000 m mountain model on which to place your sensor, you could instead set this value to 1000 and place your model on a ground plane with a Z height of zero. Noise parameters for the pressure data. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to an altimeter sensor. Noise parameters for vertical position The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical velocity The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. 
For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to camera sensors. An optional name for the camera. Horizontal field of view The image size in pixels and format. Width in pixels Height in pixels (L8|R8G8B8|B8G8R8|BAYER_RGGB8|BAYER_BGGR8|BAYER_GBRG8|BAYER_GRBG8) The near and far clip planes. Objects closer or farther than these planes are not rendered. Near clipping plane Far clipping plane Enable or disable saving of camera frames. True = saving enabled The path name which will hold the frame data. If path name is relative, then directory is relative to current working directory. Depth camera parameters Type of output The near and far clip planes. Objects closer or farther than these planes are not detected by the depth camera. Near clipping plane for depth camera Far clipping plane for depth camera The properties of the noise model that should be applied to generated images The type of noise. Currently supported types are: "gaussian" (draw additive noise values independently for each pixel from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. Lens distortion to be applied to camera images. See http://en.wikipedia.org/wiki/Distortion_(optics)#Software_correction The radial distortion coefficient k1 The radial distortion coefficient k2 The radial distortion coefficient k3 The tangential distortion coefficient p1 The tangential distortion coefficient p2 The distortion center or principal point Lens projection description Type of the lens mapping. Supported values are gnomonical, stereographic, equidistant, equisolid_angle, orthographic, custom. For gnomonical (perspective) projection, it is recommended to specify a horizontal_fov of less than or equal to 90° If true the image will be scaled to fit horizontal FOV, otherwise it will be shown according to projection type parameters Definition of custom mapping function in a form of r=c1*f*fun(theta/c2 + c3). See https://en.wikipedia.org/wiki/Fisheye_lens#Mapping_function Linear scaling constant Angle scaling constant Angle offset constant Focal length of the optical system. Note: It's not a focal length of the lens in a common sense! This value is ignored if 'scale_to_fov' is set to true Possible values are 'sin', 'tan' and 'id' Everything outside of the specified angle will be hidden, 90° by default Resolution of the environment cube map used to draw the world Camera intrinsic parameters for setting a custom perspective projection matrix (cannot be used with WideAngleCamera since this class uses image stitching from 6 different cameras for achieving a wide field of view). The focal lengths can be computed using focal_length_in_pixels = (image_width_in_pixels * 0.5) / tan(field_of_view_in_degrees * 0.5 * PI/180) X focal length (in pixels, overrides horizontal_fov) Y focal length (in pixels, overrides horizontal_fov) X principal point (in pixels) Y principal point (in pixels) XY axis skew A frame of reference to which a pose is relative. 
Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. These elements are specific to the contact sensor. name of the collision element within a link that acts as the contact sensor. Topic on which contact data is published. These elements are specific to the GPS sensor. Parameters related to GPS position measurement. Noise parameters for horizontal position measurement, in units of meters. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical position measurement, in units of meters. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to GPS velocity measurement. Noise parameters for horizontal velocity measurement, in units of meters/second. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie.
rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical velocity measurement, in units of meters/second. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the IMU sensor. This string represents special hardcoded use cases that are commonly seen with typical robot IMU's: - CUSTOM: use Euler angle custom_rpy orientation specification. The orientation of the IMU's reference frame is defined by adding the custom_rpy rotation to the parent_frame. - NED: The IMU XYZ aligns with NED, where NED orientation relative to Gazebo world is defined by the SphericalCoordinates class. - ENU: The IMU XYZ aligns with ENU, where ENU orientation relative to Gazebo world is defined by the SphericalCoordinates class. - NWU: The IMU XYZ aligns with NWU, where NWU orientation relative to Gazebo world is defined by the SphericalCoordinates class. - GRAV_UP: where direction of gravity maps to IMU reference frame Z-axis with Z-axis pointing in the opposite direction of gravity. IMU reference frame X-axis direction is defined by grav_dir_x. Note if grav_dir_x is parallel to gravity direction, this configuration fails. Otherwise, IMU reference frame X-axis is defined by projection of grav_dir_x onto a plane normal to the gravity vector. IMU reference frame Y-axis is a vector orthogonal to both X and Z axis following the right hand rule. - GRAV_DOWN: where direction of gravity maps to IMU reference frame Z-axis with Z-axis pointing in the direction of gravity. IMU reference frame X-axis direction is defined by grav_dir_x. 
Note if grav_dir_x is parallel to gravity direction, this configuration fails. Otherwise, IMU reference frame X-axis is defined by projection of grav_dir_x onto a plane normal to the gravity vector. IMU reference frame Y-axis is a vector orthogonal to both X and Z axis following the right hand rule. This field and parent_frame are used when localization is set to CUSTOM. Orientation (fixed axis roll, pitch yaw) transform from parent_frame to this IMU's reference frame. Some common examples are: - IMU reports in its local frame on boot. IMU sensor frame is the reference frame. Example: parent_frame="", custom_rpy="0 0 0" - IMU reports in Gazebo world frame. Example sdf: parent_frame="world", custom_rpy="0 0 0" - IMU reports in NWU frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between North-West-Up and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="0 0 0" - IMU reports in NED frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between North-East-Down and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="M_PI 0 0" - IMU reports in ENU frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between East-North-Up and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="0 0 -0.5*M_PI" - IMU reports in ROS optical frame as described in http://www.ros.org/reps/rep-0103.html#suffix-frames, which is (z-forward, x-left to right when facing +z, y-top to bottom when facing +z). (default gazebo camera is +x:view direction, +y:left, +z:up). Example sdf: parent_frame="local", custom_rpy="-0.5*M_PI 0 -0.5*M_PI" Name of parent frame which the custom_rpy transform is defined relative to. It can be any valid fully scoped Gazebo Link name or the special reserved "world" frame. If left empty, use the sensor's own local frame. Used when localization is set to GRAV_UP or GRAV_DOWN, a projection of this vector into a plane that is orthogonal to the gravity vector defines the direction of the IMU reference frame's X-axis. grav_dir_x is defined in the coordinate frame as defined by the parent_frame element. Name of parent frame in which the grav_dir_x vector is defined. It can be any valid fully scoped Gazebo Link name or the special reserved "world" frame. If left empty, use the sensor's own local frame. Topic on which data is published. DEPRECATED. Use the topic element that is a child of the sensor element. These elements are specific to body-frame angular velocity, which is expressed in radians per second Angular velocity about the X axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. 
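A minimal sketch of the orientation_reference_frame options described above, using CUSTOM localization with an explicit custom_rpy; the parent frame and rotation values are illustrative (here, a rotation of pi about X so the IMU reports in NED given an NWU world).

  <sensor name="imu_sensor" type="imu">
    <imu>
      <orientation_reference_frame>
        <localization>CUSTOM</localization>
        <!-- Illustrative: rotate the reference frame by pi about X relative to the world frame -->
        <custom_rpy parent_frame="world">3.14159265 0 0</custom_rpy>
      </orientation_reference_frame>
    </imu>
  </sensor>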
For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Angular velocity about the Y axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Angular velocity about the Z axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to body-frame linear acceleration, which is expressed in meters per second squared Linear acceleration about the X axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. 
For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Linear acceleration about the Y axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Linear acceleration about the Z axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the lidar sensor. The number of simulated lidar rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. 
Must be greater or equal to min_angle The number of simulated lidar rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle specifies range properties of each simulated lidar The minimum distance for each lidar ray. The maximum distance for each lidar ray. Linear resolution of each lidar ray. The properties of the noise model that should be applied to generated scans The type of noise. Currently supported types are: "gaussian" (draw noise values independently for each beam from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. These elements are specific to logical camera sensors. A logical camera reports objects that fall within a frustum. Computation should be performed on the CPU. Near clipping distance of the view frustum Far clipping distance of the view frustum Aspect ratio of the near and far planes. This is the width divided by the height of the near or far planes. Horizontal field of view of the frustum, in radians. This is the angle between the frustum's vertex and the edges of the near or far plane. These elements are specific to a Magnetometer sensor. Parameters related to the body-frame X axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to the body-frame Y axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. 
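For the lidar elements described above (scan samples, resolution, angle bounds, range, and noise), a minimal single-plane sketch; the "ray" sensor type and element are used here since "lidar" and "gpu_lidar" accept the same children, and all values are illustrative.

  <sensor name="laser" type="ray">
    <ray>
      <scan>
        <horizontal>
          <samples>640</samples>
          <resolution>1</resolution>
          <min_angle>-2.26889</min_angle>
          <max_angle>2.26889</max_angle>
        </horizontal>
      </scan>
      <range>
        <min>0.08</min>
        <max>10.0</max>
        <resolution>0.01</resolution>
      </range>
      <noise type="gaussian">
        <mean>0</mean>
        <stddev>0.01</stddev>
      </noise>
    </ray>
  </sensor>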
For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to the body-frame Z axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the ray (laser) sensor. The number of simulated rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle The number of simulated rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle specifies range properties of each simulated ray The minimum distance for each ray. The maximum distance for each ray. Linear resolution of each ray. The properties of the noise model that should be applied to generated scans The type of noise. Currently supported types are: "gaussian" (draw noise values independently for each beam from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. These elements are specific to the sonar sensor. The sonar collision shape. Currently supported geometries are: "cone" and "sphere". Minimum range Max range Radius of the sonar cone at max range. This parameter is only used if geometry is "cone". These elements are specific to a wireless transceiver. Service set identifier (network name) Specifies the frequency of transmission in MHz Only a frequency range is filtered. Here we set the lower bound (MHz). Only a frequency range is filtered. Here we set the upper bound (MHz). 
Specifies the antenna gain in dBi Specifies the transmission power in dBm Mininum received signal power in dBm These elements are specific to the force torque sensor. Frame in which to report the wrench values. Currently supported frames are: "parent" report the wrench expressed in the orientation of the parent link frame, "child" report the wrench expressed in the orientation of the child link frame, "sensor" report the wrench expressed in the orientation of the joint sensor frame. Note that for each option the point with respect to which the torque component of the wrench is expressed is the joint origin. Direction of the wrench measured by the sensor. The supported options are: "parent_to_child" if the measured wrench is the one applied by parent link on the child link, "child_to_parent" if the measured wrench is the one applied by the child link on the parent link. A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. A joint connects two links with kinematic and dynamic properties. By default, the pose of a joint is expressed in the child link frame. A unique name for the joint within the scope of the model. The type of joint, which must be one of the following: (continuous) a hinge joint that rotates on a single axis with a continuous range of motion, (revolute) a hinge joint that rotates on a single axis with a fixed range of motion, (gearbox) geared revolute joints, (revolute2) same as two revolute joints connected in series, (prismatic) a sliding joint that slides along an axis with a limited range specified by upper and lower limits, (ball) a ball and socket joint, (screw) a single degree of freedom joint with coupled sliding and rotational motion, (universal) like a ball joint, but constrains one degree of freedom, (fixed) a joint with zero degrees of freedom that rigidly connects two links. Name of the parent link Name of the child link Parameter for gearbox joints. Given theta_1 and theta_2 defined in description for gearbox_reference_body, theta_2 = -gearbox_ratio * theta_1. Parameter for gearbox joints. Gearbox ratio is enforced over two joint angles. First joint angle (theta_1) is the angle from the gearbox_reference_body to the parent link in the direction of the axis element and the second joint angle (theta_2) is the angle from the gearbox_reference_body to the child link in the direction of the axis2 element. Parameter for screw joints. Parameters related to the axis of rotation for revolute joints, the axis of translation for prismatic joints. Default joint position for this joint axis. Represents the x,y,z components of the axis unit vector. The axis is expressed in the joint frame unless the use_parent_model_frame flag is set to true. The vector should be normalized. Flag to interpret the axis xyz element in the parent model frame instead of joint frame. Provided for Gazebo compatibility (see https://bitbucket.org/osrf/gazebo/issue/494 ). An element specifying physical properties of the joint. 
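A minimal sketch of the force-torque sensor options described above, attached to a joint so that the wrench is reported in the child link frame; the joint, link, and sensor names are illustrative.

  <joint name="wrist_joint" type="revolute">
    <parent>forearm</parent>
    <child>hand</child>
    <axis>
      <xyz>0 0 1</xyz>
    </axis>
    <sensor name="wrist_ft" type="force_torque">
      <force_torque>
        <frame>child</frame>
        <measure_direction>child_to_parent</measure_direction>
      </force_torque>
    </sensor>
  </joint>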
These values are used to specify modeling properties of the joint, particularly useful for simulation. The physical velocity dependent viscous damping coefficient of the joint. The physical static friction value of the joint. The spring reference position for this joint axis. The spring stiffness for this joint axis. specifies the limits of this joint Specifies the lower joint limit (radians for revolute joints, meters for prismatic joints). Omit if joint is continuous. Specifies the upper joint limit (radians for revolute joints, meters for prismatic joints). Omit if joint is continuous. A value for enforcing the maximum joint effort applied. Limit is not enforced if value is negative. A value for enforcing the maximum joint velocity. Joint stop stiffness. Joint stop dissipation. Parameters related to the second axis of rotation for revolute2 joints and universal joints. Default joint position for this joint axis. Represents the x,y,z components of the axis unit vector. The axis is expressed in the joint frame unless the use_parent_model_frame flag is set to true. The vector should be normalized. Flag to interpret the axis xyz element in the parent model frame instead of joint frame. Provided for Gazebo compatibility (see https://bitbucket.org/osrf/gazebo/issue/494 ). An element specifying physical properties of the joint. These values are used to specify modeling properties of the joint, particularly useful for simulation. The physical velocity dependent viscous damping coefficient of the joint. EXPERIMENTAL: if damping coefficient is negative and implicit_spring_damper is true, adaptive damping is used. The physical static friction value of the joint. The spring reference position for this joint axis. The spring stiffness for this joint axis. An attribute specifying the lower joint limit (radians for revolute joints, meters for prismatic joints). Omit if joint is continuous. An attribute specifying the upper joint limit (radians for revolute joints, meters for prismatic joints). Omit if joint is continuous. An attribute for enforcing the maximum joint effort applied by Joint::SetForce. Limit is not enforced if value is negative. (not implemented) An attribute for enforcing the maximum joint velocity. Joint stop stiffness. Supported physics engines: SimBody. Joint stop dissipation. Supported physics engines: SimBody. Parameters that are specific to a certain physics engine. Simbody specific parameters Force cut in the multibody graph at this joint. ODE specific parameters (DEPRECATION WARNING: In SDFormat 1.5 this tag will be replaced by the same tag directly under the physics-block. For now, this tag overrides the one outside of ode-block, but in SDFormat 1.5 this tag will be removed completely.) If provide feedback is set to true, ODE will compute the constraint forces at this joint. If cfm damping is set to true, ODE will use CFM to simulate damping, allows for infinite damping, and one additional constraint row (previously used for joint limit) is always active. If implicit_spring_damper is set to true, ODE will use CFM, ERP to simulate stiffness and damping, allows for infinite damping, and one additional constraint row (previously used for joint limit) is always active. This replaces cfm_damping parameter in SDFormat 1.4. Scale the excess for in a joint motor at joint limits. Should be between zero and one. 
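Bringing together the joint axis, dynamics, limit, and ODE-specific elements described above, a sketch of a revolute joint follows; the damping, limit, and solver values are illustrative, not recommended settings.

  <joint name="shoulder" type="revolute">
    <parent>torso</parent>
    <child>upper_arm</child>
    <axis>
      <xyz>0 1 0</xyz>
      <dynamics>
        <damping>0.2</damping>
        <friction>0.05</friction>
        <spring_reference>0</spring_reference>
        <spring_stiffness>0</spring_stiffness>
      </dynamics>
      <limit>
        <lower>-1.57</lower>
        <upper>1.57</upper>
        <effort>30</effort>
        <velocity>2.0</velocity>
      </limit>
    </axis>
    <physics>
      <ode>
        <implicit_spring_damper>true</implicit_spring_damper>
        <limit>
          <cfm>0</cfm>
          <erp>0.2</erp>
        </limit>
      </ode>
      <provide_feedback>true</provide_feedback>
    </physics>
  </joint>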
Constraint force mixing for constrained directions Error reduction parameter for constrained directions Bounciness of the limits Maximum force or torque used to reach the desired velocity. The desired velocity of the joint. Should only be set if you want the joint to move on load. Constraint force mixing parameter used by the joint stop Error reduction parameter used by the joint stop Suspension constraint force mixing parameter Suspension error reduction parameter If provide feedback is set to true, physics engine will compute the constraint forces at this joint. For now, provide_feedback under ode block will override this tag and given user warning about the migration. provide_feedback under ode is scheduled to be removed in SDFormat 1.5. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. The sensor tag describes the type and properties of a sensor. A unique name for the sensor. This name must not match another model in the model. The type name of the sensor. By default, SDFormat supports types air_pressure, altimeter, camera, contact, depth_camera, force_torque, gps, gpu_lidar, gpu_ray, imu, lidar, logical_camera, magnetometer, multicamera, ray, rfid, rfidtag, rgbd_camera, sonar, thermal_camera, wireless_receiver, and wireless_transmitter. The "ray" and "gpu_ray" types are equivalent to "lidar" and "gpu_lidar", respectively. It is preferred to use "lidar" and "gpu_lidar" since "ray" and "gpu_ray" will be deprecated. The "ray" and "gpu_ray" types are maintained for legacy support. If true the sensor will always be updated according to the update rate. The frequency at which the sensor data is generated. If left unspecified, the sensor will generate data every cycle. If true, the sensor is visualized in the GUI Name of the topic on which data is published. This is necessary for visualization A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. These elements are specific to an air pressure sensor. The initial altitude in meters. This value can be used by a sensor implementation to augment the altitude of the sensor. For example, if you are using simulation instead of creating a 1000 m mountain model on which to place your sensor, you could instead set this value to 1000 and place your model on a ground plane with a Z height of zero. Noise parameters for the pressure data. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). 
"gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to an altimeter sensor. Noise parameters for vertical position The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical velocity The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to camera sensors. An optional name for the camera. 
Horizontal field of view The image size in pixels and format. Width in pixels Height in pixels (L8|R8G8B8|B8G8R8|BAYER_RGGB8|BAYER_BGGR8|BAYER_GBRG8|BAYER_GRBG8) The near and far clip planes. Objects closer or farther than these planes are not rendered. Near clipping plane Far clipping plane Enable or disable saving of camera frames. True = saving enabled The path name which will hold the frame data. If path name is relative, then directory is relative to current working directory. Depth camera parameters Type of output The near and far clip planes. Objects closer or farther than these planes are not detected by the depth camera. Near clipping plane for depth camera Far clipping plane for depth camera The properties of the noise model that should be applied to generated images The type of noise. Currently supported types are: "gaussian" (draw additive noise values independently for each pixel from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. Lens distortion to be applied to camera images. See http://en.wikipedia.org/wiki/Distortion_(optics)#Software_correction The radial distortion coefficient k1 The radial distortion coefficient k2 The radial distortion coefficient k3 The tangential distortion coefficient p1 The tangential distortion coefficient p2 The distortion center or principal point Lens projection description Type of the lens mapping. Supported values are gnomonical, stereographic, equidistant, equisolid_angle, orthographic, custom. For gnomonical (perspective) projection, it is recommended to specify a horizontal_fov of less than or equal to 90° If true the image will be scaled to fit horizontal FOV, otherwise it will be shown according to projection type parameters Definition of custom mapping function in a form of r=c1*f*fun(theta/c2 + c3). See https://en.wikipedia.org/wiki/Fisheye_lens#Mapping_function Linear scaling constant Angle scaling constant Angle offset constant Focal length of the optical system. Note: It's not a focal length of the lens in a common sense! This value is ignored if 'scale_to_fov' is set to true Possible values are 'sin', 'tan' and 'id' Everything outside of the specified angle will be hidden, 90° by default Resolution of the environment cube map used to draw the world Camera intrinsic parameters for setting a custom perspective projection matrix (cannot be used with WideAngleCamera since this class uses image stitching from 6 different cameras for achieving a wide field of view). The focal lengths can be computed using focal_length_in_pixels = (image_width_in_pixels * 0.5) / tan(field_of_view_in_degrees * 0.5 * PI/180) X focal length (in pixels, overrides horizontal_fov) Y focal length (in pixels, overrides horizontal_fov) X principal point (in pixels) Y principal point (in pixels) XY axis skew A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. These elements are specific to the contact sensor. name of the collision element within a link that acts as the contact sensor. 
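A minimal sketch of the contact sensor described here, attached to a link and referring to one of that link's collision elements by name; the names and geometry are illustrative.

  <link name="chassis">
    <collision name="chassis_collision">
      <geometry>
        <box>
          <size>1 0.5 0.2</size>
        </box>
      </geometry>
    </collision>
    <sensor name="bumper" type="contact">
      <contact>
        <collision>chassis_collision</collision>
        <topic>bumper/contacts</topic>
      </contact>
    </sensor>
  </link>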
Topic on which contact data is published. These elements are specific to the GPS sensor. Parameters related to GPS position measurement. Noise parameters for horizontal position measurement, in units of meters. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical position measurement, in units of meters. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to GPS position measurement. Noise parameters for horizontal velocity measurement, in units of meters/second. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. 
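Putting the GPS position-sensing and velocity-sensing noise blocks described above together, a condensed sketch; only the gaussian mean and standard deviation are shown, and all values are illustrative.

  <sensor name="gps_sensor" type="gps">
    <gps>
      <position_sensing>
        <horizontal>
          <noise type="gaussian">
            <mean>0</mean>
            <stddev>2.0</stddev>
          </noise>
        </horizontal>
        <vertical>
          <noise type="gaussian">
            <mean>0</mean>
            <stddev>4.0</stddev>
          </noise>
        </vertical>
      </position_sensing>
      <velocity_sensing>
        <horizontal>
          <noise type="gaussian">
            <mean>0</mean>
            <stddev>0.1</stddev>
          </noise>
        </horizontal>
      </velocity_sensing>
    </gps>
  </sensor>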
For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical velocity measurement, in units of meters/second. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the IMU sensor. This string represents special hardcoded use cases that are commonly seen with typical robot IMU's: - CUSTOM: use Euler angle custom_rpy orientation specification. The orientation of the IMU's reference frame is defined by adding the custom_rpy rotation to the parent_frame. - NED: The IMU XYZ aligns with NED, where NED orientation relative to Gazebo world is defined by the SphericalCoordinates class. - ENU: The IMU XYZ aligns with ENU, where ENU orientation relative to Gazebo world is defined by the SphericalCoordinates class. - NWU: The IMU XYZ aligns with NWU, where NWU orientation relative to Gazebo world is defined by the SphericalCoordinates class. - GRAV_UP: where direction of gravity maps to IMU reference frame Z-axis with Z-axis pointing in the opposite direction of gravity. IMU reference frame X-axis direction is defined by grav_dir_x. Note if grav_dir_x is parallel to gravity direction, this configuration fails. Otherwise, IMU reference frame X-axis is defined by projection of grav_dir_x onto a plane normal to the gravity vector. IMU reference frame Y-axis is a vector orthogonal to both X and Z axis following the right hand rule. - GRAV_DOWN: where direction of gravity maps to IMU reference frame Z-axis with Z-axis pointing in the direction of gravity. IMU reference frame X-axis direction is defined by grav_dir_x. Note if grav_dir_x is parallel to gravity direction, this configuration fails. Otherwise, IMU reference frame X-axis is defined by projection of grav_dir_x onto a plane normal to the gravity vector. IMU reference frame Y-axis is a vector orthogonal to both X and Z axis following the right hand rule. This field and parent_frame are used when localization is set to CUSTOM. Orientation (fixed axis roll, pitch yaw) transform from parent_frame to this IMU's reference frame. Some common examples are: - IMU reports in its local frame on boot. IMU sensor frame is the reference frame. 
Example: parent_frame="", custom_rpy="0 0 0" - IMU reports in Gazebo world frame. Example sdf: parent_frame="world", custom_rpy="0 0 0" - IMU reports in NWU frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between North-West-Up and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="0 0 0" - IMU reports in NED frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between North-East-Down and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="M_PI 0 0" - IMU reports in ENU frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between East-North-Up and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="0 0 -0.5*M_PI" - IMU reports in ROS optical frame as described in http://www.ros.org/reps/rep-0103.html#suffix-frames, which is (z-forward, x-left to right when facing +z, y-top to bottom when facing +z). (default gazebo camera is +x:view direction, +y:left, +z:up). Example sdf: parent_frame="local", custom_rpy="-0.5*M_PI 0 -0.5*M_PI" Name of parent frame which the custom_rpy transform is defined relative to. It can be any valid fully scoped Gazebo Link name or the special reserved "world" frame. If left empty, use the sensor's own local frame. Used when localization is set to GRAV_UP or GRAV_DOWN, a projection of this vector into a plane that is orthogonal to the gravity vector defines the direction of the IMU reference frame's X-axis. grav_dir_x is defined in the coordinate frame as defined by the parent_frame element. Name of parent frame in which the grav_dir_x vector is defined. It can be any valid fully scoped Gazebo Link name or the special reserved "world" frame. If left empty, use the sensor's own local frame. Topic on which data is published. DEPRECATED. Use the topic element that is a child of the sensor element. These elements are specific to body-frame angular velocity, which is expressed in radians per second Angular velocity about the X axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Angular velocity about the Y axis The properties of a sensor noise model. 
The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Angular velocity about the Z axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to body-frame linear acceleration, which is expressed in meters per second squared Linear acceleration about the X axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. 
A value of zero implies infinite precision / no quantization. Linear acceleration about the Y axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Linear acceleration about the Z axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the lidar sensor. The number of simulated lidar rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle The number of simulated lidar rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle specifies range properties of each simulated lidar The minimum distance for each lidar ray. The maximum distance for each lidar ray. Linear resolution of each lidar ray. The properties of the noise model that should be applied to generated scans The type of noise. Currently supported types are: "gaussian" (draw noise values independently for each beam from a Gaussian distribution). 
For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. These elements are specific to logical camera sensors. A logical camera reports objects that fall within a frustum. Computation should be performed on the CPU. Near clipping distance of the view frustum Far clipping distance of the view frustum Aspect ratio of the near and far planes. This is the width divided by the height of the near or far planes. Horizontal field of view of the frustum, in radians. This is the angle between the frustum's vertex and the edges of the near or far plane. These elements are specific to a Magnetometer sensor. Parameters related to the body-frame X axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to the body-frame Y axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to the body-frame Z axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. 
rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the ray (laser) sensor. The number of simulated rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle The number of simulated rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle specifies range properties of each simulated ray The minimum distance for each ray. The maximum distance for each ray. Linear resolution of each ray. The properties of the noise model that should be applied to generated scans The type of noise. Currently supported types are: "gaussian" (draw noise values independently for each beam from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. These elements are specific to the sonar sensor. The sonar collision shape. Currently supported geometries are: "cone" and "sphere". Minimum range Max range Radius of the sonar cone at max range. This parameter is only used if geometry is "cone". These elements are specific to a wireless transceiver. Service set identifier (network name) Specifies the frequency of transmission in MHz Only a frequency range is filtered. Here we set the lower bound (MHz). Only a frequency range is filtered. Here we set the upper bound (MHz). Specifies the antenna gain in dBi Specifies the transmission power in dBm Minimum received signal power in dBm These elements are specific to the force torque sensor. Frame in which to report the wrench values. Currently supported frames are: "parent" report the wrench expressed in the orientation of the parent link frame, "child" report the wrench expressed in the orientation of the child link frame, "sensor" report the wrench expressed in the orientation of the joint sensor frame. Note that for each option the point with respect to which the torque component of the wrench is expressed is the joint origin. Direction of the wrench measured by the sensor. The supported options are: "parent_to_child" if the measured wrench is the one applied by parent link on the child link, "child_to_parent" if the measured wrench is the one applied by the child link on the parent link. 
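As a rough illustration of the force torque description above, the following sketch shows how these values are typically written in SDF XML. The element names (force_torque, frame, measure_direction) follow the SDFormat sensor schema as commonly used; the sensor name and update rate are placeholders.

  <sensor name="wrist_wrench" type="force_torque">
    <update_rate>100</update_rate>
    <force_torque>
      <!-- Report the wrench in the orientation of the child link frame -->
      <frame>child</frame>
      <!-- Measure the wrench applied by the parent link on the child link -->
      <measure_direction>parent_to_child</measure_direction>
    </force_torque>
  </sensor>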
Name of the road Width of the road A series of points that define the path of the road. The material of the visual element. Name of material from an installed script file. This will override the color element if the script exists. URI of the material script file Name of the script within the script file vertex, pixel, normal_map_objectspace, normal_map_tangentspace filename of the normal map If false, dynamic lighting will be disabled The ambient color of a material specified by set of four numbers representing red/green/blue, each in the range of [0,1]. The diffuse color of a material specified by set of four numbers representing red/green/blue/alpha, each in the range of [0,1]. The specular color of a material specified by set of four numbers representing red/green/blue/alpha, each in the range of [0,1]. The emissive color of a material specified by set of four numbers representing red/green/blue, each in the range of [0,1]. Physically Based Rendering (PBR) material. There are two PBR workflows: metal and specular. While both workflows and their parameters can be specified at the same time, typically only one of them will be used (depending on the underlying renderer capability). It is also recommended to use the same workflow for all materials in the world. PBR using the Metallic/Roughness workflow. Filename of the diffuse/albedo map. Filename of the roughness map. Material roughness in the range of [0,1], where 0 represents a smooth surface and 1 represents a rough surface. This is the inverse of a specular map in a PBR specular workflow. Filename of the metalness map. Material metalness in the range of [0,1], where 0 represents non-metal and 1 represents raw metal Filename of the environment / reflection map, typically in the form of a cubemap Filename of the ambient occlusion map. The map defines the amount of ambient lighting on the surface. Filename of the normal map. The normals can be in the object space or tangent space as specified in the 'type' attribute The space that the normals are in. Values are: 'object' or 'tangent' Filename of the emissive map. PBR using the Specular/Glossiness workflow. Filename of the diffuse/albedo map. Filename of the specular map. Filename of the glossiness map. Material glossiness in the range of [0-1], where 0 represents a rough surface and 1 represents a smooth surface. This is the inverse of a roughness map in a PBR metal workflow. Filename of the ambient occlusion map. The map defines the amount of ambient lighting on the surface. Filename of the normal map. The normals can be in the object space or tangent space as specified in the 'type' attribute The space that the normals are in. Values are: 'object' or 'tangent' Filename of the emissive map. Name of planetary surface model, used to determine the surface altitude at a given latitude and longitude. The default is an ellipsoid model of the earth based on the WGS-84 standard. It is used in Gazebo's GPS sensor implementation. This field identifies how Gazebo world frame is aligned in Geographical sense. The final Gazebo world frame orientation is obtained by rotating a frame aligned with following notation by the field heading_deg (Note that heading_deg corresponds to positive yaw rotation in the NED frame, so it's inverse specifies positive Z-rotation in ENU or NWU). Options are: - ENU (East-North-Up) - NED (North-East-Down) - NWU (North-West-Up) For example, world frame specified by setting world_orientation="ENU" and heading_deg=-90° is effectively equivalent to NWU with heading of 0°. 
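A minimal material sketch combining the script and color elements with the PBR metal workflow described above. Element names follow the SDFormat material schema; the script name and map filenames are illustrative placeholders, not shipped assets.

  <material>
    <script>
      <uri>file://media/materials/scripts/gazebo.material</uri>
      <name>Gazebo/Grey</name>
    </script>
    <ambient>0.3 0.3 0.3 1</ambient>
    <diffuse>0.7 0.7 0.7 1</diffuse>
    <specular>0.01 0.01 0.01 1</specular>
    <emissive>0 0 0 1</emissive>
    <pbr>
      <metal>
        <albedo_map>materials/textures/albedo.png</albedo_map>
        <normal_map type="tangent">materials/textures/normal.png</normal_map>
        <roughness>0.5</roughness>   <!-- 0 = smooth, 1 = rough -->
        <metalness>0.1</metalness>   <!-- 0 = non-metal, 1 = raw metal -->
      </metal>
    </pbr>
  </material>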
Geodetic latitude at origin of gazebo reference frame, specified in units of degrees. Longitude at origin of gazebo reference frame, specified in units of degrees. Elevation of origin of gazebo reference frame, specified in meters. Heading offset of gazebo reference frame, measured as angle between Gazebo world frame and the world_frame_orientation type (ENU/NED/NWU). Rotations about the downward-vector (e.g. North to East) are positive. The direction of rotation is chosen to be consistent with compass heading convention (e.g. 0 degrees points North and 90 degrees points East, positive rotation indicates clockwise rotation when viewed from top-down direction). The angle is specified in degrees. Name of the world this state applies to Simulation time stamp of the state [seconds nanoseconds] Wall time stamp of the state [seconds nanoseconds] Real time stamp of the state [seconds nanoseconds] Number of simulation iterations. A list containing the entire description of entities inserted. The model element defines a complete robot or any other physical object. A unique name for the model. This name must not match another model in the world. If set to true, the model is immovable. Otherwise the model is simulated in the dynamics engine. If set to true, all links in the model will collide with each other (except those connected by a joint). Can be overridden by the link or collision element self_collide property. Two links within a model will collide if link1.self_collide OR link2.self_collide. Links connected by a joint will never collide. Allows a model to auto-disable, which means the physics engine can skip updating the model when the model is at rest. This parameter is only used by models with no joints. Include resources from a URI. This can be used to nest models. URI to a resource, such as a model Override the pose of the included model. A position and orientation in the global coordinate frame for the model. Position(x,y,z) and rotation (roll, pitch yaw) in the global coordinate frame. Override the name of the included model. Override the static value of the included model. A nested model element A unique name for the model. This name must not match another nested model in the same level as this model. If set to true, all links in the model will be affected by the wind. Can be overridden by the link wind property. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A physical link with inertia, collision, and visual properties. A link must be a child of a model, and any number of links may exist in a model. A unique name for the link within the scope of the model. If true, the link is affected by gravity. If true, the link is affected by the wind. If true, the link can collide with other links in the model. Two links within a model will collide if link1.self_collide OR link2.self_collide. Links connected by a joint will never collide. If true, the link is kinematic only If true, the link will have 6DOF and be a direct child of world. Exponential damping of the link's velocity. Linear damping Angular damping A frame of reference to which a pose is relative. Name of the frame. 
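A sketch of the model and include elements covered above, assuming a model://-style URI; the model names and pose are illustrative.

  <model name="cart">
    <static>false</static>
    <self_collide>false</self_collide>
    <allow_auto_disable>true</allow_auto_disable>
    <!-- Nest another model by reference and override its name, static flag, and pose -->
    <include>
      <uri>model://hokuyo</uri>
      <name>front_laser</name>
      <static>false</static>
      <pose>0.2 0 0.3 0 0 0</pose>
    </include>
  </model>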
This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. The inertial properties of the link. The mass of the link. The 3x3 rotational inertia matrix. Because the rotational inertia matrix is symmetric, only 6 above-diagonal elements of this matrix are specified here, using the attributes ixx, ixy, ixz, iyy, iyz, izz. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. This is the pose of the inertial reference frame, relative to the specified reference frame. The origin of the inertial reference frame needs to be at the center of gravity. The axes of the inertial reference frame do not need to be aligned with the principal axes of the inertia. Name of frame which the pose is defined relative to. The collision properties of a link. Note that this can be different from the visual properties of a link, for example, simpler collision models are often used to reduce computation time. Unique name for the collision element within the scope of the parent link. intensity value returned by laser sensor. Maximum number of contacts allowed between two entities. This value overrides the max_contacts element defined in physics. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. The shape of the visual or collision object. You can use the empty tag to make empty geometries. Box shape The three side lengths of the box. The origin of the box is in its geometric center (inside the center of the box). Cylinder shape Radius of the cylinder Length of the cylinder A heightmap based on a 2d grayscale image. URI to a grayscale image file The size of the heightmap in world units. When loading an image: "size" is used if present, otherwise defaults to 1x1x1. When loading a DEM: "size" is used if present, otherwise defaults to true size of DEM. A position offset. The heightmap can contain multiple textures. The order of the texture matters. The first texture will appear at the lowest height, and the last texture at the highest height. Use blend to control the height thresholds and fade between textures. Size of the applied texture in meters. Diffuse texture image filename Normalmap texture image filename The blend tag controls how two adjacent textures are mixed. The number of blend elements should equal one less than the number of textures. Min height of a blend layer Distance over which the blend occurs Set if the rendering engine will use terrain paging Samples per heightmap datum. For rasterized heightmaps, this indicates the number of samples to take per pixel. Using a lower value, e.g. 
1, will generally improve the performance of the heightmap but lower the heightmap quality. Extrude a set of boxes from a grayscale image. URI of the grayscale image file Scaling factor applied to the image Grayscale threshold Height of the extruded boxes The amount of error in the model Mesh shape Mesh uri Use a named submesh. The submesh must exist in the mesh specified by the uri Name of the submesh within the parent mesh Set to true to center the vertices of the submesh at 0,0,0. This will effectively remove any transformations on the submesh before the poses from parent links and models are applied. Scaling factor applied to the mesh Plane shape Normal direction for the plane. When a Plane is used as a geometry for a Visual or Collision object, then the normal is specified in the Visual or Collision frame, respectively. Length of each side of the plane. Note that this property is meaningful only for visualizing the Plane, i.e., when the Plane is used as a geometry for a Visual object. The Plane has infinite size when used as a geometry for a Collision object. Defines an extruded polyline shape A series of points that define the path of the polyline. Height of the polyline Sphere shape radius of the sphere The surface parameters Bounciness coefficient of restitution, from [0...1], where 0=no bounciness. Bounce capture velocity, below which effective coefficient of restitution is 0. Parameters for torsional friction Torsional friction coefficient, unitless maximum ratio of tangential stress to normal stress. If this flag is true, torsional friction is calculated using the "patch_radius" parameter. If this flag is set to false, "surface_radius" (R) and contact depth (d) are used to compute the patch radius as sqrt(R*d). Radius of contact patch surface. Surface radius on the point of contact. Torsional friction parameters for ODE Force dependent slip for torsional friction, equivalent to inverse of viscous damping coefficient with units of rad/s/(Nm). A slip value of 0 is infinitely viscous. ODE friction parameters Coefficient of friction in first friction pyramid direction, the unitless maximum ratio of force in first friction pyramid direction to normal force. Coefficient of friction in second friction pyramid direction, the unitless maximum ratio of force in second friction pyramid direction to normal force. Unit vector specifying first friction pyramid direction in collision-fixed reference frame. If the friction pyramid model is in use, and this value is set to a unit vector for one of the colliding surfaces, the ODE Collide callback function will align the friction pyramid directions with a reference frame fixed to that collision surface. If both surfaces have this value set to a vector of zeros, the friction pyramid directions will be aligned with the world frame. If this value is set for both surfaces, the behavior is undefined. Force dependent slip in first friction pyramid direction, equivalent to inverse of viscous damping coefficient with units of m/s/N. A slip value of 0 is infinitely viscous. Force dependent slip in second friction pyramid direction, equivalent to inverse of viscous damping coefficient with units of m/s/N. A slip value of 0 is infinitely viscous. Coefficient of friction in first friction pyramid direction, the unitless maximum ratio of force in first friction pyramid direction to normal force. Coefficient of friction in second friction pyramid direction, the unitless maximum ratio of force in second friction pyramid direction to normal force. 
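The collision, geometry, and ODE friction elements described above might be combined as in the following sketch; the coefficients are illustrative, not recommended values.

  <collision name="base_collision">
    <geometry>
      <box>
        <size>0.5 0.5 0.2</size>
      </box>
    </geometry>
    <surface>
      <friction>
        <ode>
          <mu>0.6</mu>          <!-- friction in first pyramid direction -->
          <mu2>0.6</mu2>        <!-- friction in second pyramid direction -->
          <fdir1>0 0 0</fdir1>  <!-- zeros: align pyramid with the world frame -->
          <slip1>0</slip1>      <!-- 0 = infinitely viscous (no slip) -->
          <slip2>0</slip2>
        </ode>
      </friction>
    </surface>
  </collision>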
Unit vector specifying first friction pyramid direction in collision-fixed reference frame. If the friction pyramid model is in use, and this value is set to a unit vector for one of the colliding surfaces, the friction pyramid directions will be aligned with a reference frame fixed to that collision surface. If both surfaces have this value set to a vector of zeros, the friction pyramid directions will be aligned with the world frame. If this value is set for both surfaces, the behavior is undefined. Coefficient of rolling friction Flag to disable contact force generation, while still allowing collision checks and contact visualization to occur. Bitmask for collision filtering when collide_without_contact is on Bitmask for collision filtering. This will override collide_without_contact Bitmask for category of collision filtering. Collision happens if ((category1 & collision2) | (category2 & collision1)) is not zero. If not specified, the category_bitmask should be interpreted as being the same as collide_bitmask. Poisson's ratio is the unitless ratio between transverse and axial strain. This value must lie between (-1, 0.5). Defaults to 0.3 for typical steel. Note typical silicone elastomers have Poisson's ratio near 0.49 ~ 0.50. For reference, approximate values for Material:(Young's Modulus, Poisson's Ratio) for some of the typical materials are: Plastic: (1e8 ~ 3e9 Pa, 0.35 ~ 0.41), Wood: (4e9 ~ 1e10 Pa, 0.22 ~ 0.50), Aluminum: (7e10 Pa, 0.32 ~ 0.35), Steel: (2e11 Pa, 0.26 ~ 0.31). Young's Modulus in SI derived unit Pascal. Defaults to -1. If value is less or equal to zero, contact using elastic modulus (with Poisson's Ratio) is disabled. For reference, approximate values for Material:(Young's Modulus, Poisson's Ratio) for some of the typical materials are: Plastic: (1e8 ~ 3e9 Pa, 0.35 ~ 0.41), Wood: (4e9 ~ 1e10 Pa, 0.22 ~ 0.50), Aluminum: (7e10 Pa, 0.32 ~ 0.35), Steel: (2e11 Pa, 0.26 ~ 0.31). ODE contact parameters Soft constraint force mixing. Soft error reduction parameter dynamically "stiffness"-equivalent coefficient for contact joints dynamically "damping"-equivalent coefficient for contact joints maximum contact correction velocity truncation term. minimum allowable depth before contact correction impulse is applied Bullet contact parameters Soft constraint force mixing. Soft error reduction parameter dynamically "stiffness"-equivalent coefficient for contact joints dynamically "damping"-equivalent coefficient for contact joints Similar to ODE's max_vel implementation. See http://bulletphysics.org/mediawiki-1.5.8/index.php/BtContactSolverInfo#Split_Impulse for more information. Similar to ODE's max_vel implementation. See http://bulletphysics.org/mediawiki-1.5.8/index.php/BtContactSolverInfo#Split_Impulse for more information. soft contact parameters based on paper: http://www.cc.gatech.edu/graphics/projects/Sumit/homepage/papers/sigasia11/jain_softcontacts_siga11.pdf This is variable k_v in the soft contacts paper. Its unit is N/m. This is variable k_e in the soft contacts paper. Its unit is N/m. Viscous damping of point velocity in body frame. Its unit is N/m/s. Fraction of mass to be distributed among deformable nodes. The visual properties of the link. This element specifies the shape of the object (box, cylinder, etc.) for visualization purposes. Unique name for the visual element within the scope of the parent link. If true the visual will cast shadows. Will be implemented in a future release. 
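The ODE contact parameters described above might appear inside a collision's surface block as in this sketch; the element names (soft_cfm, soft_erp, kp, kd, max_vel, min_depth) follow the usual SDFormat surface schema and the values are placeholders.

  <surface>
    <contact>
      <ode>
        <soft_cfm>0</soft_cfm>     <!-- soft constraint force mixing -->
        <soft_erp>0.2</soft_erp>   <!-- soft error reduction parameter -->
        <kp>1e12</kp>              <!-- dynamic "stiffness"-equivalent coefficient -->
        <kd>1</kd>                 <!-- dynamic "damping"-equivalent coefficient -->
        <max_vel>0.01</max_vel>    <!-- contact correction velocity truncation -->
        <min_depth>0</min_depth>   <!-- depth before a correction impulse is applied -->
      </ode>
    </contact>
  </surface>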
The amount of transparency( 0=opaque, 1 = fully transparent) Optional meta information for the visual. The information contained within this element should be used to provide additional feedback to an end user. The layer in which this visual is displayed. The layer number is useful for programs, such as Gazebo, that put visuals in different layers for enhanced visualization. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. The material of the visual element. Name of material from an installed script file. This will override the color element if the script exists. URI of the material script file Name of the script within the script file vertex, pixel, normal_map_objectspace, normal_map_tangentspace filename of the normal map If false, dynamic lighting will be disabled The ambient color of a material specified by set of four numbers representing red/green/blue, each in the range of [0,1]. The diffuse color of a material specified by set of four numbers representing red/green/blue/alpha, each in the range of [0,1]. The specular color of a material specified by set of four numbers representing red/green/blue/alpha, each in the range of [0,1]. The emissive color of a material specified by set of four numbers representing red/green/blue, each in the range of [0,1]. Physically Based Rendering (PBR) material. There are two PBR workflows: metal and specular. While both workflows and their parameters can be specified at the same time, typically only one of them will be used (depending on the underlying renderer capability). It is also recommended to use the same workflow for all materials in the world. PBR using the Metallic/Roughness workflow. Filename of the diffuse/albedo map. Filename of the roughness map. Material roughness in the range of [0,1], where 0 represents a smooth surface and 1 represents a rough surface. This is the inverse of a specular map in a PBR specular workflow. Filename of the metalness map. Material metalness in the range of [0,1], where 0 represents non-metal and 1 represents raw metal Filename of the environment / reflection map, typically in the form of a cubemap Filename of the ambient occlusion map. The map defines the amount of ambient lighting on the surface. Filename of the normal map. The normals can be in the object space or tangent space as specified in the 'type' attribute The space that the normals are in. Values are: 'object' or 'tangent' Filename of the emissive map. PBR using the Specular/Glossiness workflow. Filename of the diffuse/albedo map. Filename of the specular map. Filename of the glossiness map. Material glossiness in the range of [0-1], where 0 represents a rough surface and 1 represents a smooth surface. This is the inverse of a roughness map in a PBR metal workflow. Filename of the ambient occlusion map. The map defines the amount of ambient lighting on the surface. Filename of the normal map. The normals can be in the object space or tangent space as specified in the 'type' attribute The space that the normals are in. Values are: 'object' or 'tangent' Filename of the emissive map. The shape of the visual or collision object. 
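A visual sketch tying together the transparency, meta/layer, material, and geometry elements above; the mesh path and layer value are illustrative.

  <visual name="body_visual">
    <cast_shadows>true</cast_shadows>
    <transparency>0</transparency>
    <meta>
      <layer>0</layer>
    </meta>
    <geometry>
      <mesh>
        <uri>model://cart/meshes/body.dae</uri>
        <scale>1 1 1</scale>
      </mesh>
    </geometry>
    <material>
      <diffuse>0.8 0.1 0.1 1</diffuse>
    </material>
  </visual>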
You can use the empty tag to make empty geometries. Box shape The three side lengths of the box. The origin of the box is in its geometric center (inside the center of the box). Cylinder shape Radius of the cylinder Length of the cylinder A heightmap based on a 2d grayscale image. URI to a grayscale image file The size of the heightmap in world units. When loading an image: "size" is used if present, otherwise defaults to 1x1x1. When loading a DEM: "size" is used if present, otherwise defaults to true size of DEM. A position offset. The heightmap can contain multiple textures. The order of the texture matters. The first texture will appear at the lowest height, and the last texture at the highest height. Use blend to control the height thresholds and fade between textures. Size of the applied texture in meters. Diffuse texture image filename Normalmap texture image filename The blend tag controls how two adjacent textures are mixed. The number of blend elements should equal one less than the number of textures. Min height of a blend layer Distance over which the blend occurs Set if the rendering engine will use terrain paging Samples per heightmap datum. For rasterized heightmaps, this indicates the number of samples to take per pixel. Using a lower value, e.g. 1, will generally improve the performance of the heightmap but lower the heightmap quality. Extrude a set of boxes from a grayscale image. URI of the grayscale image file Scaling factor applied to the image Grayscale threshold Height of the extruded boxes The amount of error in the model Mesh shape Mesh uri Use a named submesh. The submesh must exist in the mesh specified by the uri Name of the submesh within the parent mesh Set to true to center the vertices of the submesh at 0,0,0. This will effectively remove any transformations on the submesh before the poses from parent links and models are applied. Scaling factor applied to the mesh Plane shape Normal direction for the plane. When a Plane is used as a geometry for a Visual or Collision object, then the normal is specified in the Visual or Collision frame, respectively. Length of each side of the plane. Note that this property is meaningful only for visualizing the Plane, i.e., when the Plane is used as a geometry for a Visual object. The Plane has infinite size when used as a geometry for a Collision object. Defines an extruded polyline shape A series of points that define the path of the polyline. Height of the polyline Sphere shape radius of the sphere A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. The sensor tag describes the type and properties of a sensor. A unique name for the sensor. This name must not match another sensor in the model. The type name of the sensor. By default, SDFormat supports types air_pressure, altimeter, camera, contact, depth_camera, force_torque, gps, gpu_lidar, gpu_ray, imu, lidar, logical_camera, magnetometer, multicamera, ray, rfid, rfidtag, rgbd_camera, sonar, thermal_camera, wireless_receiver, and wireless_transmitter. The "ray" and "gpu_ray" types are equivalent to "lidar" and "gpu_lidar", respectively. It is preferred to use "lidar" and "gpu_lidar" since "ray" and "gpu_ray" will be deprecated. The "ray" and "gpu_ray" types are maintained for legacy support. 
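Looking back at the heightmap geometry described earlier in this paragraph, a sketch with two textures and one blend layer might read as follows; the image filenames and heights are placeholders, and one blend element is used because there is one less blend than there are textures.

  <geometry>
    <heightmap>
      <uri>file://media/materials/textures/terrain.png</uri>
      <size>129 129 10</size>
      <pos>0 0 0</pos>
      <texture>
        <size>10</size>
        <diffuse>grass_diffuse.png</diffuse>
        <normal>grass_normal.png</normal>
      </texture>
      <texture>
        <size>10</size>
        <diffuse>rock_diffuse.png</diffuse>
        <normal>rock_normal.png</normal>
      </texture>
      <blend>
        <min_height>4</min_height>
        <fade_dist>3</fade_dist>
      </blend>
      <use_terrain_paging>false</use_terrain_paging>
    </heightmap>
  </geometry>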
If true the sensor will always be updated according to the update rate. The frequency at which the sensor data is generated. If left unspecified, the sensor will generate data every cycle. If true, the sensor is visualized in the GUI Name of the topic on which data is published. This is necessary for visualization A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. These elements are specific to an air pressure sensor. The initial altitude in meters. This value can be used by a sensor implementation to augment the altitude of the sensor. For example, if you are using simulation instead of creating a 1000 m mountain model on which to place your sensor, you could instead set this value to 1000 and place your model on a ground plane with a Z height of zero. Noise parameters for the pressure data. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to an altimeter sensor. Noise parameters for vertical position The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. 
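A sketch of an air pressure sensor using the common sensor elements and the reference altitude described above; the topic name and noise values are placeholders.

  <sensor name="barometer" type="air_pressure">
    <always_on>true</always_on>
    <update_rate>20</update_rate>
    <topic>air_pressure</topic>
    <air_pressure>
      <reference_altitude>0</reference_altitude>
      <pressure>
        <noise type="gaussian">
          <mean>0</mean>
          <stddev>3.0</stddev>
        </noise>
      </pressure>
    </air_pressure>
  </sensor>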
For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical velocity The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to camera sensors. An optional name for the camera. Horizontal field of view The image size in pixels and format. Width in pixels Height in pixels (L8|R8G8B8|B8G8R8|BAYER_RGGB8|BAYER_BGGR8|BAYER_GBRG8|BAYER_GRBG8) The near and far clip planes. Objects closer or farther than these planes are not rendered. Near clipping plane Far clipping plane Enable or disable saving of camera frames. True = saving enabled The path name which will hold the frame data. If path name is relative, then directory is relative to current working directory. Depth camera parameters Type of output The near and far clip planes. Objects closer or farther than these planes are not detected by the depth camera. Near clipping plane for depth camera Far clipping plane for depth camera The properties of the noise model that should be applied to generated images The type of noise. Currently supported types are: "gaussian" (draw additive noise values independently for each pixel from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. Lens distortion to be applied to camera images. See http://en.wikipedia.org/wiki/Distortion_(optics)#Software_correction The radial distortion coefficient k1 The radial distortion coefficient k2 The radial distortion coefficient k3 The tangential distortion coefficient p1 The tangential distortion coefficient p2 The distortion center or principal point Lens projection description Type of the lens mapping. Supported values are gnomonical, stereographic, equidistant, equisolid_angle, orthographic, custom. 
For gnomonical (perspective) projection, it is recommended to specify a horizontal_fov of less than or equal to 90° If true the image will be scaled to fit horizontal FOV, otherwise it will be shown according to projection type parameters Definition of custom mapping function in a form of r=c1*f*fun(theta/c2 + c3). See https://en.wikipedia.org/wiki/Fisheye_lens#Mapping_function Linear scaling constant Angle scaling constant Angle offset constant Focal length of the optical system. Note: It's not a focal length of the lens in a common sense! This value is ignored if 'scale_to_fov' is set to true Possible values are 'sin', 'tan' and 'id' Everything outside of the specified angle will be hidden, 90° by default Resolution of the environment cube map used to draw the world Camera intrinsic parameters for setting a custom perspective projection matrix (cannot be used with WideAngleCamera since this class uses image stitching from 6 different cameras for achieving a wide field of view). The focal lengths can be computed using focal_length_in_pixels = (image_width_in_pixels * 0.5) / tan(field_of_view_in_degrees * 0.5 * PI/180) X focal length (in pixels, overrides horizontal_fov) Y focal length (in pixels, overrides horizontal_fov) X principal point (in pixels) Y principal point (in pixels) XY axis skew A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. These elements are specific to the contact sensor. name of the collision element within a link that acts as the contact sensor. Topic on which contact data is published. These elements are specific to the GPS sensor. Parameters related to GPS position measurement. Noise parameters for horizontal position measurement, in units of meters. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical position measurement, in units of meters. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). 
"gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to GPS position measurement. Noise parameters for horizontal velocity measurement, in units of meters/second. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical velocity measurement, in units of meters/second. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the IMU sensor. 
This string represents special hardcoded use cases that are commonly seen with typical robot IMU's: - CUSTOM: use Euler angle custom_rpy orientation specification. The orientation of the IMU's reference frame is defined by adding the custom_rpy rotation to the parent_frame. - NED: The IMU XYZ aligns with NED, where NED orientation relative to Gazebo world is defined by the SphericalCoordinates class. - ENU: The IMU XYZ aligns with ENU, where ENU orientation relative to Gazebo world is defined by the SphericalCoordinates class. - NWU: The IMU XYZ aligns with NWU, where NWU orientation relative to Gazebo world is defined by the SphericalCoordinates class. - GRAV_UP: where direction of gravity maps to IMU reference frame Z-axis with Z-axis pointing in the opposite direction of gravity. IMU reference frame X-axis direction is defined by grav_dir_x. Note if grav_dir_x is parallel to gravity direction, this configuration fails. Otherwise, IMU reference frame X-axis is defined by projection of grav_dir_x onto a plane normal to the gravity vector. IMU reference frame Y-axis is a vector orthogonal to both X and Z axis following the right hand rule. - GRAV_DOWN: where direction of gravity maps to IMU reference frame Z-axis with Z-axis pointing in the direction of gravity. IMU reference frame X-axis direction is defined by grav_dir_x. Note if grav_dir_x is parallel to gravity direction, this configuration fails. Otherwise, IMU reference frame X-axis is defined by projection of grav_dir_x onto a plane normal to the gravity vector. IMU reference frame Y-axis is a vector orthogonal to both X and Z axis following the right hand rule. This field and parent_frame are used when localization is set to CUSTOM. Orientation (fixed axis roll, pitch yaw) transform from parent_frame to this IMU's reference frame. Some common examples are: - IMU reports in its local frame on boot. IMU sensor frame is the reference frame. Example: parent_frame="", custom_rpy="0 0 0" - IMU reports in Gazebo world frame. Example sdf: parent_frame="world", custom_rpy="0 0 0" - IMU reports in NWU frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between North-West-Up and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="0 0 0" - IMU reports in NED frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between North-East-Down and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="M_PI 0 0" - IMU reports in ENU frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between East-North-Up and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="0 0 -0.5*M_PI" - IMU reports in ROS optical frame as described in http://www.ros.org/reps/rep-0103.html#suffix-frames, which is (z-forward, x-left to right when facing +z, y-top to bottom when facing +z). (default gazebo camera is +x:view direction, +y:left, +z:up). Example sdf: parent_frame="local", custom_rpy="-0.5*M_PI 0 -0.5*M_PI" Name of parent frame which the custom_rpy transform is defined relative to. It can be any valid fully scoped Gazebo Link name or the special reserved "world" frame. If left empty, use the sensor's own local frame. 
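A sketch of the IMU orientation reference frame options just described, using the CUSTOM localization together with a custom_rpy and its parent_frame attribute (the grav_dir_x alternative is covered next). The element names follow the usual SDFormat imu schema; noise is shown for one angular velocity axis only and the values are placeholders.

  <sensor name="imu_sensor" type="imu">
    <update_rate>250</update_rate>
    <imu>
      <orientation_reference_frame>
        <localization>CUSTOM</localization>
        <!-- Reference frame equals the parent_frame rotated by custom_rpy -->
        <custom_rpy parent_frame="world">0 0 0</custom_rpy>
      </orientation_reference_frame>
      <angular_velocity>
        <x>
          <noise type="gaussian">
            <mean>0</mean>
            <stddev>0.0002</stddev>
            <bias_mean>0.0000075</bias_mean>
            <bias_stddev>0.0000008</bias_stddev>
          </noise>
        </x>
      </angular_velocity>
    </imu>
  </sensor>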
Used when localization is set to GRAV_UP or GRAV_DOWN, a projection of this vector into a plane that is orthogonal to the gravity vector defines the direction of the IMU reference frame's X-axis. grav_dir_x is defined in the coordinate frame as defined by the parent_frame element. Name of parent frame in which the grav_dir_x vector is defined. It can be any valid fully scoped Gazebo Link name or the special reserved "world" frame. If left empty, use the sensor's own local frame. Topic on which data is published. DEPRECATED. Use the topic element that is a child of the sensor element. These elements are specific to body-frame angular velocity, which is expressed in radians per second Angular velocity about the X axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Angular velocity about the Y axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Angular velocity about the Z axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. 
For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to body-frame linear acceleration, which is expressed in meters per second squared Linear acceleration about the X axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Linear acceleration about the Y axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Linear acceleration about the Z axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). 
"gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the lidar sensor. The number of simulated lidar rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle The number of simulated lidar rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle specifies range properties of each simulated lidar The minimum distance for each lidar ray. The maximum distance for each lidar ray. Linear resolution of each lidar ray. The properties of the noise model that should be applied to generated scans The type of noise. Currently supported types are: "gaussian" (draw noise values independently for each beam from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. These elements are specific to logical camera sensors. A logical camera reports objects that fall within a frustum. Computation should be performed on the CPU. Near clipping distance of the view frustum Far clipping distance of the view frustum Aspect ratio of the near and far planes. This is the width divided by the height of the near or far planes. Horizontal field of view of the frustum, in radians. This is the angle between the frustum's vertex and the edges of the near or far plane. These elements are specific to a Magnetometer sensor. Parameters related to the body-frame X axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. 
For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to the body-frame Y axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to the body-frame Z axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the ray (laser) sensor. The number of simulated rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle The number of simulated rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. 
Must be greater than or equal to min_angle specifies range properties of each simulated ray The minimum distance for each ray. The maximum distance for each ray. Linear resolution of each ray. The properties of the noise model that should be applied to generated scans The type of noise. Currently supported types are: "gaussian" (draw noise values independently for each beam from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. These elements are specific to the sonar sensor. The sonar collision shape. Currently supported geometries are: "cone" and "sphere". Minimum range Maximum range Radius of the sonar cone at max range. This parameter is only used if geometry is "cone". These elements are specific to a wireless transceiver. Service set identifier (network name) Specifies the frequency of transmission in MHz Only a frequency range is filtered. Here we set the lower bound (MHz). Only a frequency range is filtered. Here we set the upper bound (MHz). Specifies the antenna gain in dBi Specifies the transmission power in dBm Minimum received signal power in dBm These elements are specific to the force torque sensor. Frame in which to report the wrench values. Currently supported frames are: "parent" report the wrench expressed in the orientation of the parent link frame, "child" report the wrench expressed in the orientation of the child link frame, "sensor" report the wrench expressed in the orientation of the joint sensor frame. Note that for each option the point with respect to which the torque component of the wrench is expressed is the joint origin. Direction of the wrench measured by the sensor. The supported options are: "parent_to_child" if the measured wrench is the one applied by parent link on the child link, "child_to_parent" if the measured wrench is the one applied by the child link on the parent link. Name of the projector Texture name Field of view Near clip distance Far clip distance A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. An audio sink. An audio source. URI of the audio media. Pitch for the audio media, in Hz Gain for the audio media, in dB. List of collision objects that will trigger audio playback. Name of child collision element that will trigger audio playback. True to make the audio source loop playback. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame.
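The force torque sensor's frame and measure_direction options described above might be written as in the sketch below. This is illustrative only: the sensor name is hypothetical, and a force torque sensor is normally attached to a joint so that the joint origin and joint sensor frame are well defined.

  <sensor name="wrist_ft" type="force_torque">
    <force_torque>
      <!-- report the wrench in the child link's orientation -->
      <frame>child</frame>
      <!-- measure the wrench applied by the parent link on the child link -->
      <measure_direction>parent_to_child</measure_direction>
    </force_torque>
  </sensor>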
Name of frame which the pose is defined relative to. Description of a battery. Unique name for the battery. Initial voltage in volts. The light element describes a light source. A unique name for the light. The light type: point, directional, spot. When true, the light will cast shadows. Diffuse light color Specular light color Light attenuation Range of the light The linear attenuation factor: 1 means attenuate evenly over the distance. The constant attenuation factor: 1.0 means never attenuate, 0.0 is complete attenutation. The quadratic attenuation factor: adds a curvature to the attenuation. Direction of the light, only applicable for spot and directional lights. Spot light parameters Angle covered by the bright inner cone Angle covered by the outer cone The rate of falloff between the inner and outer cones. 1.0 means a linear falloff, less means slower falloff, higher means faster falloff. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A joint connects two links with kinematic and dynamic properties. By default, the pose of a joint is expressed in the child link frame. A unique name for the joint within the scope of the model. The type of joint, which must be one of the following: (continuous) a hinge joint that rotates on a single axis with a continuous range of motion, (revolute) a hinge joint that rotates on a single axis with a fixed range of motion, (gearbox) geared revolute joints, (revolute2) same as two revolute joints connected in series, (prismatic) a sliding joint that slides along an axis with a limited range specified by upper and lower limits, (ball) a ball and socket joint, (screw) a single degree of freedom joint with coupled sliding and rotational motion, (universal) like a ball joint, but constrains one degree of freedom, (fixed) a joint with zero degrees of freedom that rigidly connects two links. Name of the parent link Name of the child link Parameter for gearbox joints. Given theta_1 and theta_2 defined in description for gearbox_reference_body, theta_2 = -gearbox_ratio * theta_1. Parameter for gearbox joints. Gearbox ratio is enforced over two joint angles. First joint angle (theta_1) is the angle from the gearbox_reference_body to the parent link in the direction of the axis element and the second joint angle (theta_2) is the angle from the gearbox_reference_body to the child link in the direction of the axis2 element. Parameter for screw joints. Parameters related to the axis of rotation for revolute joints, the axis of translation for prismatic joints. Default joint position for this joint axis. Represents the x,y,z components of the axis unit vector. The axis is expressed in the joint frame unless the use_parent_model_frame flag is set to true. The vector should be normalized. Flag to interpret the axis xyz element in the parent model frame instead of joint frame. Provided for Gazebo compatibility (see https://bitbucket.org/osrf/gazebo/issue/494 ). An element specifying physical properties of the joint. These values are used to specify modeling properties of the joint, particularly useful for simulation. 
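Putting the joint elements described so far together, a revolute joint might be sketched as follows. The joint and link names are hypothetical, and the dynamics and limit elements described next would also sit under the axis element.

  <joint name="shoulder" type="revolute">
    <parent>base_link</parent>    <!-- name of the parent link -->
    <child>upper_arm</child>      <!-- name of the child link -->
    <axis>
      <!-- unit vector of the rotation axis, expressed in the joint frame -->
      <xyz>0 0 1</xyz>
      <use_parent_model_frame>false</use_parent_model_frame>
    </axis>
  </joint>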
The physical velocity dependent viscous damping coefficient of the joint. The physical static friction value of the joint. The spring reference position for this joint axis. The spring stiffness for this joint axis. specifies the limits of this joint Specifies the lower joint limit (radians for revolute joints, meters for prismatic joints). Omit if joint is continuous. Specifies the upper joint limit (radians for revolute joints, meters for prismatic joints). Omit if joint is continuous. A value for enforcing the maximum joint effort applied. Limit is not enforced if value is negative. A value for enforcing the maximum joint velocity. Joint stop stiffness. Joint stop dissipation. Parameters related to the second axis of rotation for revolute2 joints and universal joints. Default joint position for this joint axis. Represents the x,y,z components of the axis unit vector. The axis is expressed in the joint frame unless the use_parent_model_frame flag is set to true. The vector should be normalized. Flag to interpret the axis xyz element in the parent model frame instead of joint frame. Provided for Gazebo compatibility (see https://bitbucket.org/osrf/gazebo/issue/494 ). An element specifying physical properties of the joint. These values are used to specify modeling properties of the joint, particularly useful for simulation. The physical velocity dependent viscous damping coefficient of the joint. EXPERIMENTAL: if damping coefficient is negative and implicit_spring_damper is true, adaptive damping is used. The physical static friction value of the joint. The spring reference position for this joint axis. The spring stiffness for this joint axis. An attribute specifying the lower joint limit (radians for revolute joints, meters for prismatic joints). Omit if joint is continuous. An attribute specifying the upper joint limit (radians for revolute joints, meters for prismatic joints). Omit if joint is continuous. An attribute for enforcing the maximum joint effort applied by Joint::SetForce. Limit is not enforced if value is negative. (not implemented) An attribute for enforcing the maximum joint velocity. Joint stop stiffness. Supported physics engines: SimBody. Joint stop dissipation. Supported physics engines: SimBody. Parameters that are specific to a certain physics engine. Simbody specific parameters Force cut in the multibody graph at this joint. ODE specific parameters (DEPRECATION WARNING: In SDFormat 1.5 this tag will be replaced by the same tag directly under the physics-block. For now, this tag overrides the one outside of ode-block, but in SDFormat 1.5 this tag will be removed completely.) If provide feedback is set to true, ODE will compute the constraint forces at this joint. If cfm damping is set to true, ODE will use CFM to simulate damping, allows for infinite damping, and one additional constraint row (previously used for joint limit) is always active. If implicit_spring_damper is set to true, ODE will use CFM, ERP to simulate stiffness and damping, allows for infinite damping, and one additional constraint row (previously used for joint limit) is always active. This replaces cfm_damping parameter in SDFormat 1.4. Scale the excess for in a joint motor at joint limits. Should be between zero and one. Constraint force mixing for constrained directions Error reduction parameter for constrained directions Bounciness of the limits Maximum force or torque used to reach the desired velocity. The desired velocity of the joint. 
Should only be set if you want the joint to move on load. Constraint force mixing parameter used by the joint stop Error reduction parameter used by the joint stop Suspension constraint force mixing parameter Suspension error reduction parameter If provide feedback is set to true, physics engine will compute the constraint forces at this joint. For now, provide_feedback under ode block will override this tag and given user warning about the migration. provide_feedback under ode is scheduled to be removed in SDFormat 1.5. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. The sensor tag describes the type and properties of a sensor. A unique name for the sensor. This name must not match another model in the model. The type name of the sensor. By default, SDFormat supports types air_pressure, altimeter, camera, contact, depth_camera, force_torque, gps, gpu_lidar, gpu_ray, imu, lidar, logical_camera, magnetometer, multicamera, ray, rfid, rfidtag, rgbd_camera, sonar, thermal_camera, wireless_receiver, and wireless_transmitter. The "ray" and "gpu_ray" types are equivalent to "lidar" and "gpu_lidar", respectively. It is preferred to use "lidar" and "gpu_lidar" since "ray" and "gpu_ray" will be deprecated. The "ray" and "gpu_ray" types are maintained for legacy support. If true the sensor will always be updated according to the update rate. The frequency at which the sensor data is generated. If left unspecified, the sensor will generate data every cycle. If true, the sensor is visualized in the GUI Name of the topic on which data is published. This is necessary for visualization A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. These elements are specific to an air pressure sensor. The initial altitude in meters. This value can be used by a sensor implementation to augment the altitude of the sensor. For example, if you are using simulation instead of creating a 1000 m mountain model on which to place your sensor, you could instead set this value to 1000 and place your model on a ground plane with a Z height of zero. Noise parameters for the pressure data. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. 
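The common sensor attributes and elements listed above (name, type, always_on, update_rate, visualize, topic) are shared by every sensor type. A minimal skeleton, with hypothetical names and illustrative values:

  <sensor name="chassis_imu" type="imu">
    <always_on>true</always_on>      <!-- always update according to update_rate -->
    <update_rate>100</update_rate>   <!-- data generation frequency, in Hz -->
    <visualize>false</visualize>
    <topic>chassis/imu</topic>       <!-- topic on which data is published -->
    <pose>0 0 0.1 0 0 0</pose>
  </sensor>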
For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to an altimeter sensor. Noise parameters for vertical position The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical velocity The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to camera sensors. An optional name for the camera. Horizontal field of view The image size in pixels and format. Width in pixels Height in pixels (L8|R8G8B8|B8G8R8|BAYER_RGGB8|BAYER_BGGR8|BAYER_GBRG8|BAYER_GRBG8) The near and far clip planes. Objects closer or farther than these planes are not rendered. 
Near clipping plane Far clipping plane Enable or disable saving of camera frames. True = saving enabled The path name which will hold the frame data. If path name is relative, then directory is relative to current working directory. Depth camera parameters Type of output The near and far clip planes. Objects closer or farther than these planes are not detected by the depth camera. Near clipping plane for depth camera Far clipping plane for depth camera The properties of the noise model that should be applied to generated images The type of noise. Currently supported types are: "gaussian" (draw additive noise values independently for each pixel from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. Lens distortion to be applied to camera images. See http://en.wikipedia.org/wiki/Distortion_(optics)#Software_correction The radial distortion coefficient k1 The radial distortion coefficient k2 The radial distortion coefficient k3 The tangential distortion coefficient p1 The tangential distortion coefficient p2 The distortion center or principal point Lens projection description Type of the lens mapping. Supported values are gnomonical, stereographic, equidistant, equisolid_angle, orthographic, custom. For gnomonical (perspective) projection, it is recommended to specify a horizontal_fov of less than or equal to 90° If true the image will be scaled to fit horizontal FOV, otherwise it will be shown according to projection type parameters Definition of custom mapping function in a form of r=c1*f*fun(theta/c2 + c3). See https://en.wikipedia.org/wiki/Fisheye_lens#Mapping_function Linear scaling constant Angle scaling constant Angle offset constant Focal length of the optical system. Note: It's not a focal length of the lens in a common sense! This value is ignored if 'scale_to_fov' is set to true Possible values are 'sin', 'tan' and 'id' Everything outside of the specified angle will be hidden, 90° by default Resolution of the environment cube map used to draw the world Camera intrinsic parameters for setting a custom perspective projection matrix (cannot be used with WideAngleCamera since this class uses image stitching from 6 different cameras for achieving a wide field of view). The focal lengths can be computed using focal_length_in_pixels = (image_width_in_pixels * 0.5) / tan(field_of_view_in_degrees * 0.5 * PI/180) X focal length (in pixels, overrides horizontal_fov) Y focal length (in pixels, overrides horizontal_fov) X principal point (in pixels) Y principal point (in pixels) XY axis skew A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. These elements are specific to the contact sensor. name of the collision element within a link that acts as the contact sensor. Topic on which contact data is published. These elements are specific to the GPS sensor. Parameters related to GPS position measurement. Noise parameters for horizontal position measurement, in units of meters. The properties of a sensor noise model. 
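The focal length formula above can be applied directly: for a 640 pixel wide image with a 90 degree horizontal field of view, focal_length_in_pixels = (640 * 0.5) / tan(90 * 0.5 * PI/180) = 320. A rough camera sketch using that result follows; the camera name and clip values are illustrative, and in recent SDFormat versions the intrinsics block sits under the lens element as shown.

  <camera name="front_cam">
    <horizontal_fov>1.5708</horizontal_fov>   <!-- 90 degrees, in radians -->
    <image>
      <width>640</width>
      <height>480</height>
      <format>R8G8B8</format>
    </image>
    <clip>
      <near>0.1</near>
      <far>100</far>
    </clip>
    <lens>
      <intrinsics>
        <fx>320</fx>   <!-- X focal length in pixels, from the formula above -->
        <fy>320</fy>   <!-- Y focal length in pixels -->
        <cx>320</cx>   <!-- X principal point (half the image width) -->
        <cy>240</cy>   <!-- Y principal point (half the image height) -->
        <s>0</s>       <!-- XY axis skew -->
      </intrinsics>
    </lens>
  </camera>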
The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical position measurement, in units of meters. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to GPS position measurement. Noise parameters for horizontal velocity measurement, in units of meters/second. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. 
A value of zero implies infinite precision / no quantization. Noise parameters for vertical velocity measurement, in units of meters/second. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the IMU sensor. This string represents special hardcoded use cases that are commonly seen with typical robot IMU's: - CUSTOM: use Euler angle custom_rpy orientation specification. The orientation of the IMU's reference frame is defined by adding the custom_rpy rotation to the parent_frame. - NED: The IMU XYZ aligns with NED, where NED orientation relative to Gazebo world is defined by the SphericalCoordinates class. - ENU: The IMU XYZ aligns with ENU, where ENU orientation relative to Gazebo world is defined by the SphericalCoordinates class. - NWU: The IMU XYZ aligns with NWU, where NWU orientation relative to Gazebo world is defined by the SphericalCoordinates class. - GRAV_UP: where direction of gravity maps to IMU reference frame Z-axis with Z-axis pointing in the opposite direction of gravity. IMU reference frame X-axis direction is defined by grav_dir_x. Note if grav_dir_x is parallel to gravity direction, this configuration fails. Otherwise, IMU reference frame X-axis is defined by projection of grav_dir_x onto a plane normal to the gravity vector. IMU reference frame Y-axis is a vector orthogonal to both X and Z axis following the right hand rule. - GRAV_DOWN: where direction of gravity maps to IMU reference frame Z-axis with Z-axis pointing in the direction of gravity. IMU reference frame X-axis direction is defined by grav_dir_x. Note if grav_dir_x is parallel to gravity direction, this configuration fails. Otherwise, IMU reference frame X-axis is defined by projection of grav_dir_x onto a plane normal to the gravity vector. IMU reference frame Y-axis is a vector orthogonal to both X and Z axis following the right hand rule. This field and parent_frame are used when localization is set to CUSTOM. Orientation (fixed axis roll, pitch yaw) transform from parent_frame to this IMU's reference frame. Some common examples are: - IMU reports in its local frame on boot. IMU sensor frame is the reference frame. Example: parent_frame="", custom_rpy="0 0 0" - IMU reports in Gazebo world frame. Example sdf: parent_frame="world", custom_rpy="0 0 0" - IMU reports in NWU frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. 
rotation between North-West-Up and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="0 0 0" - IMU reports in NED frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between North-East-Down and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="M_PI 0 0" - IMU reports in ENU frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between East-North-Up and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="0 0 -0.5*M_PI" - IMU reports in ROS optical frame as described in http://www.ros.org/reps/rep-0103.html#suffix-frames, which is (z-forward, x-left to right when facing +z, y-top to bottom when facing +z). (default gazebo camera is +x:view direction, +y:left, +z:up). Example sdf: parent_frame="local", custom_rpy="-0.5*M_PI 0 -0.5*M_PI" Name of parent frame which the custom_rpy transform is defined relative to. It can be any valid fully scoped Gazebo Link name or the special reserved "world" frame. If left empty, use the sensor's own local frame. Used when localization is set to GRAV_UP or GRAV_DOWN, a projection of this vector into a plane that is orthogonal to the gravity vector defines the direction of the IMU reference frame's X-axis. grav_dir_x is defined in the coordinate frame as defined by the parent_frame element. Name of parent frame in which the grav_dir_x vector is defined. It can be any valid fully scoped Gazebo Link name or the special reserved "world" frame. If left empty, use the sensor's own local frame. Topic on which data is published. DEPRECATED. Use the topic element that is a child of the sensor element. These elements are specific to body-frame angular velocity, which is expressed in radians per second Angular velocity about the X axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Angular velocity about the Y axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. 
rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Angular velocity about the Z axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to body-frame linear acceleration, which is expressed in meters per second squared Linear acceleration about the X axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Linear acceleration about the Y axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). 
"gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Linear acceleration about the Z axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the lidar sensor. The number of simulated lidar rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle The number of simulated lidar rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle specifies range properties of each simulated lidar The minimum distance for each lidar ray. The maximum distance for each lidar ray. Linear resolution of each lidar ray. The properties of the noise model that should be applied to generated scans The type of noise. Currently supported types are: "gaussian" (draw noise values independently for each beam from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. 
These elements are specific to logical camera sensors. A logical camera reports objects that fall within a frustum. Computation should be performed on the CPU. Near clipping distance of the view frustum Far clipping distance of the view frustum Aspect ratio of the near and far planes. This is the width divided by the height of the near or far planes. Horizontal field of view of the frustum, in radians. This is the angle between the frustum's vertex and the edges of the near or far plane. These elements are specific to a Magnetometer sensor. Parameters related to the body-frame X axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to the body-frame Y axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to the body-frame Z axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. 
For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the ray (laser) sensor. The number of simulated rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle The number of simulated rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle specifies range properties of each simulated ray The minimum distance for each ray. The maximum distance for each ray. Linear resolution of each ray. The properties of the noise model that should be applied to generated scans The type of noise. Currently supported types are: "gaussian" (draw noise values independently for each beam from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. These elements are specific to the sonar sensor. The sonar collision shape. Currently supported geometries are: "cone" and "sphere". Minimum range Max range Radius of the sonar cone at max range. This parameter is only used if geometry is "cone". These elements are specific to a wireless transceiver. Service set identifier (network name) Specifies the frequency of transmission in MHz Only a frequency range is filtered. Here we set the lower bound (MHz). Only a frequency range is filtered. Here we set the upper bound (MHz). Specifies the antenna gain in dBi Specifies the transmission power in dBm Mininum received signal power in dBm These elements are specific to the force torque sensor. Frame in which to report the wrench values. Currently supported frames are: "parent" report the wrench expressed in the orientation of the parent link frame, "child" report the wrench expressed in the orientation of the child link frame, "sensor" report the wrench expressed in the orientation of the joint sensor frame. Note that for each option the point with respect to which the torque component of the wrench is expressed is the joint origin. Direction of the wrench measured by the sensor. The supported options are: "parent_to_child" if the measured wrench is the one applied by parent link on the child link, "child_to_parent" if the measured wrench is the one applied by the child link on the parent link. A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. 
If the filename is not a full path name, the file will be searched for in the configuration paths. The light element describes a light source. A unique name for the light. The light type: point, directional, spot. When true, the light will cast shadows. Diffuse light color Specular light color Light attenuation Range of the light The linear attenuation factor: 1 means attenuate evenly over the distance. The constant attenuation factor: 1.0 means never attenuate, 0.0 is complete attenuation. The quadratic attenuation factor: adds a curvature to the attenuation. Direction of the light, only applicable for spot and directional lights. Spot light parameters Angle covered by the bright inner cone Angle covered by the outer cone The rate of falloff between the inner and outer cones. 1.0 means a linear falloff, less means slower falloff, higher means faster falloff. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A list of names of deleted entities. The name of a deleted entity. Model state Name of the model Joint angle Name of the joint Angle of an axis Index of the axis. A nested model state element Name of the model. Scale for the 3 dimensions of the model. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. Link state Name of the link Velocity of the link. The x, y, z components of the pose correspond to the linear velocity of the link, and the roll, pitch, yaw components correspond to the angular velocity of the link Acceleration of the link. The x, y, z components of the pose correspond to the linear acceleration of the link, and the roll, pitch, yaw components correspond to the angular acceleration of the link Force and torque applied to the link. The x, y, z components of the pose correspond to the force applied to the link, and the roll, pitch, yaw components correspond to the torque applied to the link Collision state Name of the collision A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. Light state Name of the light A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame.
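The state elements just described (model pose and scale, joint angles, link velocity, acceleration, and wrench) are normally gathered inside a world state element. A rough sketch follows, assuming the enclosing element is a state block with a world_name attribute as used by Gazebo logs; the world, model, and link names are hypothetical.

  <state world_name="default">
    <model name="box1">
      <pose>1 0 0.5 0 0 0</pose>
      <scale>1 1 1</scale>
      <link name="link">
        <pose>1 0 0.5 0 0 0</pose>
        <!-- x y z linear velocity followed by roll pitch yaw angular velocity -->
        <velocity>0 0 0 0 0 0</velocity>
      </link>
    </model>
  </state>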
Name of frame which the pose is defined relative to. The population element defines how and where a set of models will be automatically populated in Gazebo. A unique name for the population. This name must not match another population in the world. The number of models to place. Specifies the type of object distribution and its optional parameters. Define how the objects will be placed in the specified region. - random: Models placed at random. - uniform: Models approximately placed in a 2D grid pattern with control over the number of objects. - grid: Models evenly placed in a 2D grid pattern. The number of objects is not explicitly specified, it is based on the number of rows and columns of the grid. - linear-x: Models evenly placed in a row along the global x-axis. - linear-y: Models evenly placed in a row along the global y-axis. - linear-z: Models evenly placed in a row along the global z-axis. Number of rows in the grid. Number of columns in the grid. Distance between elements of the grid. Box shape The three side lengths of the box. The origin of the box is in its geometric center (inside the center of the box). Cylinder shape Radius of the cylinder Length of the cylinder A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. The model element defines a complete robot or any other physical object. A unique name for the model. This name must not match another model in the world. If set to true, the model is immovable. Otherwise the model is simulated in the dynamics engine. If set to true, all links in the model will collide with each other (except those connected by a joint). Can be overridden by the link or collision element self_collide property. Two links within a model will collide if link1.self_collide OR link2.self_collide. Links connected by a joint will never collide. Allows a model to auto-disable, which means the physics engine can skip updating the model when the model is at rest. This parameter is only used by models with no joints. Include resources from a URI. This can be used to nest models. URI to a resource, such as a model Override the pose of the included model. A position and orientation in the global coordinate frame for the model. Position(x,y,z) and rotation (roll, pitch yaw) in the global coordinate frame. Override the name of the included model. Override the static value of the included model. A nested model element A unique name for the model. This name must not match another nested model in the same level as this model. If set to true, all links in the model will be affected by the wind. Can be overridden by the link wind property. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A physical link with inertia, collision, and visual properties.
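The population element described above can be sketched as follows: ten models placed at random inside a box region. The model being cloned is pulled in with an include whose URI is hypothetical; all other values are illustrative, and for the grid distribution type the rows, cols, and step elements would be added under distribution instead.

  <population name="box_population">
    <model_count>10</model_count>
    <distribution>
      <type>random</type>      <!-- models placed at random inside the region -->
    </distribution>
    <box>
      <size>3 3 0.1</size>     <!-- region in which the models are placed -->
    </box>
    <pose>0 0 0 0 0 0</pose>
    <model name="box">
      <include>
        <uri>model://cardboard_box</uri>
      </include>
    </model>
  </population>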
A link must be a child of a model, and any number of links may exist in a model. A unique name for the link within the scope of the model. If true, the link is affected by gravity. If true, the link is affected by the wind. If true, the link can collide with other links in the model. Two links within a model will collide if link1.self_collide OR link2.self_collide. Links connected by a joint will never collide. If true, the link is kinematic only If true, the link will have 6DOF and be a direct child of world. Exponential damping of the link's velocity. Linear damping Angular damping A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. The inertial properties of the link. The mass of the link. The 3x3 rotational inertia matrix. Because the rotational inertia matrix is symmetric, only 6 above-diagonal elements of this matrix are specified here, using the attributes ixx, ixy, ixz, iyy, iyz, izz. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. This is the pose of the inertial reference frame, relative to the specified reference frame. The origin of the inertial reference frame needs to be at the center of gravity. The axes of the inertial reference frame do not need to be aligned with the principal axes of the inertia. Name of frame which the pose is defined relative to. The collision properties of a link. Note that this can be different from the visual properties of a link, for example, simpler collision models are often used to reduce computation time. Unique name for the collision element within the scope of the parent link. intensity value returned by laser sensor. Maximum number of contacts allowed between two entities. This value overrides the max_contacts element defined in physics. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. The shape of the visual or collision object. You can use the empty tag to make empty geometries. Box shape The three side lengths of the box. The origin of the box is in its geometric center (inside the center of the box). Cylinder shape Radius of the cylinder Length of the cylinder A heightmap based on a 2d grayscale image. URI to a grayscale image file The size of the heightmap in world units. When loading an image: "size" is used if present, otherwise defaults to 1x1x1. When loading a DEM: "size" is used if present, otherwise defaults to true size of DEM. A position offset. The heightmap can contain multiple textures. The order of the texture matters. 
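The link, inertial, collision, and visual elements described above combine roughly as in the following sketch; the mass, inertia values, and box size are illustrative, and the heightmap texture description resumes after it:

  <link name="chassis">
    <pose>0 0 0.1 0 0 0</pose>
    <inertial>
      <mass>1.5</mass>
      <inertia>
        <ixx>0.01</ixx> <ixy>0</ixy> <ixz>0</ixz>
        <iyy>0.01</iyy> <iyz>0</iyz>
        <izz>0.02</izz>
      </inertia>
    </inertial>
    <collision name="chassis_collision">
      <geometry>
        <box><size>0.4 0.3 0.1</size></box>
      </geometry>
    </collision>
    <visual name="chassis_visual">
      <geometry>
        <box><size>0.4 0.3 0.1</size></box>
      </geometry>
    </visual>
  </link>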
The first texture will appear at the lowest height, and the last texture at the highest height. Use blend to control the height thresholds and fade between textures. Size of the applied texture in meters. Diffuse texture image filename Normalmap texture image filename The blend tag controls how two adjacent textures are mixed. The number of blend elements should equal one less than the number of textures. Min height of a blend layer Distance over which the blend occurs Set if the rendering engine will use terrain paging Samples per heightmap datum. For rasterized heightmaps, this indicates the number of samples to take per pixel. Using a lower value, e.g. 1, will generally improve the performance of the heightmap but lower the heightmap quality. Extrude a set of boxes from a grayscale image. URI of the grayscale image file Scaling factor applied to the image Grayscale threshold Height of the extruded boxes The amount of error in the model Mesh shape Mesh uri Use a named submesh. The submesh must exist in the mesh specified by the uri Name of the submesh within the parent mesh Set to true to center the vertices of the submesh at 0,0,0. This will effectively remove any transformations on the submesh before the poses from parent links and models are applied. Scaling factor applied to the mesh Plane shape Normal direction for the plane. When a Plane is used as a geometry for a Visual or Collision object, then the normal is specified in the Visual or Collision frame, respectively. Length of each side of the plane. Note that this property is meaningful only for visualizing the Plane, i.e., when the Plane is used as a geometry for a Visual object. The Plane has infinite size when used as a geometry for a Collision object. Defines an extruded polyline shape A series of points that define the path of the polyline. Height of the polyline Sphere shape radius of the sphere The surface parameters Bounciness coefficient of restitution, from [0...1], where 0=no bounciness. Bounce capture velocity, below which effective coefficient of restitution is 0. Parameters for torsional friction Torsional friction coefficient, unitless maximum ratio of tangential stress to normal stress. If this flag is true, torsional friction is calculated using the "patch_radius" parameter. If this flag is set to false, "surface_radius" (R) and contact depth (d) are used to compute the patch radius as sqrt(R*d). Radius of contact patch surface. Surface radius on the point of contact. Torsional friction parameters for ODE Force dependent slip for torsional friction, equivalent to inverse of viscous damping coefficient with units of rad/s/(Nm). A slip value of 0 is infinitely viscous. ODE friction parameters Coefficient of friction in first friction pyramid direction, the unitless maximum ratio of force in first friction pyramid direction to normal force. Coefficient of friction in second friction pyramid direction, the unitless maximum ratio of force in second friction pyramid direction to normal force. Unit vector specifying first friction pyramid direction in collision-fixed reference frame. If the friction pyramid model is in use, and this value is set to a unit vector for one of the colliding surfaces, the ODE Collide callback function will align the friction pyramid directions with a reference frame fixed to that collision surface. If both surfaces have this value set to a vector of zeros, the friction pyramid directions will be aligned with the world frame. If this value is set for both surfaces, the behavior is undefined. 
Force dependent slip in first friction pyramid direction, equivalent to inverse of viscous damping coefficient with units of m/s/N. A slip value of 0 is infinitely viscous. Force dependent slip in second friction pyramid direction, equivalent to inverse of viscous damping coefficient with units of m/s/N. A slip value of 0 is infinitely viscous. Coefficient of friction in first friction pyramid direction, the unitless maximum ratio of force in first friction pyramid direction to normal force. Coefficient of friction in second friction pyramid direction, the unitless maximum ratio of force in second friction pyramid direction to normal force. Unit vector specifying first friction pyramid direction in collision-fixed reference frame. If the friction pyramid model is in use, and this value is set to a unit vector for one of the colliding surfaces, the friction pyramid directions will be aligned with a reference frame fixed to that collision surface. If both surfaces have this value set to a vector of zeros, the friction pyramid directions will be aligned with the world frame. If this value is set for both surfaces, the behavior is undefined. Coefficient of rolling friction Flag to disable contact force generation, while still allowing collision checks and contact visualization to occur. Bitmask for collision filtering when collide_without_contact is on Bitmask for collision filtering. This will override collide_without_contact Bitmask for category of collision filtering. Collision happens if ((category1 & collision2) | (category2 & collision1)) is not zero. If not specified, the category_bitmask should be interpreted as being the same as collide_bitmask. Poisson's ratio is the unitless ratio between transverse and axial strain. This value must lie between (-1, 0.5). Defaults to 0.3 for typical steel. Note typical silicone elastomers have Poisson's ratio near 0.49 ~ 0.50. For reference, approximate values for Material:(Young's Modulus, Poisson's Ratio) for some of the typical materials are: Plastic: (1e8 ~ 3e9 Pa, 0.35 ~ 0.41), Wood: (4e9 ~ 1e10 Pa, 0.22 ~ 0.50), Aluminum: (7e10 Pa, 0.32 ~ 0.35), Steel: (2e11 Pa, 0.26 ~ 0.31). Young's Modulus in SI derived unit Pascal. Defaults to -1. If value is less or equal to zero, contact using elastic modulus (with Poisson's Ratio) is disabled. For reference, approximate values for Material:(Young's Modulus, Poisson's Ratio) for some of the typical materials are: Plastic: (1e8 ~ 3e9 Pa, 0.35 ~ 0.41), Wood: (4e9 ~ 1e10 Pa, 0.22 ~ 0.50), Aluminum: (7e10 Pa, 0.32 ~ 0.35), Steel: (2e11 Pa, 0.26 ~ 0.31). ODE contact parameters Soft constraint force mixing. Soft error reduction parameter dynamically "stiffness"-equivalent coefficient for contact joints dynamically "damping"-equivalent coefficient for contact joints maximum contact correction velocity truncation term. minimum allowable depth before contact correction impulse is applied Bullet contact parameters Soft constraint force mixing. Soft error reduction parameter dynamically "stiffness"-equivalent coefficient for contact joints dynamically "damping"-equivalent coefficient for contact joints Similar to ODE's max_vel implementation. See http://bulletphysics.org/mediawiki-1.5.8/index.php/BtContactSolverInfo#Split_Impulse for more information. Similar to ODE's max_vel implementation. See http://bulletphysics.org/mediawiki-1.5.8/index.php/BtContactSolverInfo#Split_Impulse for more information. 
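Pulling together the surface properties above, here is a hedged sketch of a collision that sets ODE friction, torsional friction, bounce, and ODE contact parameters; every coefficient is an illustrative value, not a recommended default:

  <collision name="wheel_collision">
    <geometry>
      <sphere><radius>0.1</radius></sphere>
    </geometry>
    <max_contacts>10</max_contacts>
    <surface>
      <friction>
        <ode>
          <mu>0.9</mu>
          <mu2>0.7</mu2>
          <fdir1>1 0 0</fdir1>
          <slip1>0</slip1>
          <slip2>0</slip2>
        </ode>
        <torsional>
          <coefficient>0.8</coefficient>
          <use_patch_radius>true</use_patch_radius>
          <patch_radius>0.05</patch_radius>
        </torsional>
      </friction>
      <bounce>
        <restitution_coefficient>0.1</restitution_coefficient>
        <threshold>0.01</threshold>
      </bounce>
      <contact>
        <ode>
          <kp>1e6</kp>
          <kd>1</kd>
          <max_vel>0.1</max_vel>
          <min_depth>0.001</min_depth>
        </ode>
      </contact>
    </surface>
  </collision>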
Soft contact parameters based on the paper: http://www.cc.gatech.edu/graphics/projects/Sumit/homepage/papers/sigasia11/jain_softcontacts_siga11.pdf This is variable k_v in the soft contacts paper. Its unit is N/m. This is variable k_e in the soft contacts paper. Its unit is N/m. Viscous damping of point velocity in body frame. Its unit is N/m/s. Fraction of mass to be distributed among deformable nodes. The visual properties of the link. This element specifies the shape of the object (box, cylinder, etc.) for visualization purposes. Unique name for the visual element within the scope of the parent link. If true the visual will cast shadows (to be implemented in a future release). The amount of transparency (0 = opaque, 1 = fully transparent). Optional meta information for the visual. The information contained within this element should be used to provide additional feedback to an end user. The layer in which this visual is displayed. The layer number is useful for programs, such as Gazebo, that put visuals in different layers for enhanced visualization. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. The material of the visual element. Name of material from an installed script file. This will override the color element if the script exists. URI of the material script file Name of the script within the script file vertex, pixel, normal_map_objectspace, normal_map_tangentspace Filename of the normal map If false, dynamic lighting will be disabled The ambient color of a material specified by a set of four numbers representing red/green/blue/alpha, each in the range of [0,1]. The diffuse color of a material specified by a set of four numbers representing red/green/blue/alpha, each in the range of [0,1]. The specular color of a material specified by a set of four numbers representing red/green/blue/alpha, each in the range of [0,1]. The emissive color of a material specified by a set of four numbers representing red/green/blue/alpha, each in the range of [0,1]. Physically Based Rendering (PBR) material. There are two PBR workflows: metal and specular. While both workflows and their parameters can be specified at the same time, typically only one of them will be used (depending on the underlying renderer capability). It is also recommended to use the same workflow for all materials in the world. PBR using the Metallic/Roughness workflow. Filename of the diffuse/albedo map. Filename of the roughness map. Material roughness in the range of [0,1], where 0 represents a smooth surface and 1 represents a rough surface. This is the inverse of a specular map in a PBR specular workflow. Filename of the metalness map. Material metalness in the range of [0,1], where 0 represents non-metal and 1 represents raw metal. Filename of the environment / reflection map, typically in the form of a cubemap. Filename of the ambient occlusion map. The map defines the amount of ambient lighting on the surface. Filename of the normal map. The normals can be in the object space or tangent space as specified in the 'type' attribute The space that the normals are in. Values are: 'object' or 'tangent' Filename of the emissive map.
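A rough sketch of a visual that uses both the classic material colors and the PBR metal workflow just described; the texture paths and numbers are illustrative assumptions, and the specular workflow described next would simply replace the metal block:

  <visual name="body_visual">
    <transparency>0</transparency>
    <geometry>
      <box><size>1 1 1</size></box>
    </geometry>
    <material>
      <ambient>0.3 0.3 0.3 1</ambient>
      <diffuse>0.7 0.7 0.7 1</diffuse>
      <specular>0.01 0.01 0.01 1</specular>
      <emissive>0 0 0 1</emissive>
      <pbr>
        <metal>
          <albedo_map>materials/textures/albedo.png</albedo_map>
          <roughness>0.6</roughness>
          <metalness>0.1</metalness>
          <normal_map type="tangent">materials/textures/normal.png</normal_map>
        </metal>
      </pbr>
    </material>
  </visual>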
PBR using the Specular/Glossiness workflow. Filename of the diffuse/albedo map. Filename of the specular map. Filename of the glossiness map. Material glossiness in the range of [0-1], where 0 represents a rough surface and 1 represents a smooth surface. This is the inverse of a roughness map in a PBR metal workflow. Filename of the ambient occlusion map. The map defines the amount of ambient lighting on the surface. Filename of the normal map. The normals can be in the object space or tangent space as specified in the 'type' attribute The space that the normals are in. Values are: 'object' or 'tangent' Filename of the emissive map. The shape of the visual or collision object. You can use the empty tag to make empty geometries. Box shape The three side lengths of the box. The origin of the box is in its geometric center (inside the center of the box). Cylinder shape Radius of the cylinder Length of the cylinder A heightmap based on a 2d grayscale image. URI to a grayscale image file The size of the heightmap in world units. When loading an image: "size" is used if present, otherwise defaults to 1x1x1. When loading a DEM: "size" is used if present, otherwise defaults to true size of DEM. A position offset. The heightmap can contain multiple textures. The order of the texture matters. The first texture will appear at the lowest height, and the last texture at the highest height. Use blend to control the height thresholds and fade between textures. Size of the applied texture in meters. Diffuse texture image filename Normalmap texture image filename The blend tag controls how two adjacent textures are mixed. The number of blend elements should equal one less than the number of textures. Min height of a blend layer Distance over which the blend occurs Set if the rendering engine will use terrain paging Samples per heightmap datum. For rasterized heightmaps, this indicates the number of samples to take per pixel. Using a lower value, e.g. 1, will generally improve the performance of the heightmap but lower the heightmap quality. Extrude a set of boxes from a grayscale image. URI of the grayscale image file Scaling factor applied to the image Grayscale threshold Height of the extruded boxes The amount of error in the model Mesh shape Mesh uri Use a named submesh. The submesh must exist in the mesh specified by the uri Name of the submesh within the parent mesh Set to true to center the vertices of the submesh at 0,0,0. This will effectively remove any transformations on the submesh before the poses from parent links and models are applied. Scaling factor applied to the mesh Plane shape Normal direction for the plane. When a Plane is used as a geometry for a Visual or Collision object, then the normal is specified in the Visual or Collision frame, respectively. Length of each side of the plane. Note that this property is meaningful only for visualizing the Plane, i.e., when the Plane is used as a geometry for a Visual object. The Plane has infinite size when used as a geometry for a Collision object. Defines an extruded polyline shape A series of points that define the path of the polyline. Height of the polyline Sphere shape radius of the sphere A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. The sensor tag describes the type and properties of a sensor. 
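Before the sensor attributes that follow, a brief sketch of the geometry element described above, here using a mesh with a named submesh; the URI and submesh name are illustrative, and box, cylinder, plane, polyline, sphere, heightmap, and image can appear as alternative children in the same way:

  <geometry>
    <mesh>
      <uri>model://my_robot/meshes/arm.dae</uri>
      <submesh>
        <name>forearm</name>
        <center>false</center>
      </submesh>
      <scale>1 1 1</scale>
    </mesh>
  </geometry>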
A unique name for the sensor. This name must not match another sensor in the model. The type name of the sensor. By default, SDFormat supports types air_pressure, altimeter, camera, contact, depth_camera, force_torque, gps, gpu_lidar, gpu_ray, imu, lidar, logical_camera, magnetometer, multicamera, ray, rfid, rfidtag, rgbd_camera, sonar, thermal_camera, wireless_receiver, and wireless_transmitter. The "ray" and "gpu_ray" types are equivalent to "lidar" and "gpu_lidar", respectively. It is preferred to use "lidar" and "gpu_lidar" since "ray" and "gpu_ray" will be deprecated. The "ray" and "gpu_ray" types are maintained for legacy support. If true the sensor will always be updated according to the update rate. The frequency at which the sensor data is generated. If left unspecified, the sensor will generate data every cycle. If true, the sensor is visualized in the GUI Name of the topic on which data is published. This is necessary for visualization A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. These elements are specific to an air pressure sensor. The initial altitude in meters. This value can be used by a sensor implementation to augment the altitude of the sensor. For example, rather than creating a 1000 m mountain model on which to place your sensor, you could instead set this value to 1000 and place your model on a ground plane with a Z height of zero. Noise parameters for the pressure data. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise).
"gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical velocity The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to camera sensors. An optional name for the camera. Horizontal field of view The image size in pixels and format. Width in pixels Height in pixels (L8|R8G8B8|B8G8R8|BAYER_RGGB8|BAYER_BGGR8|BAYER_GBRG8|BAYER_GRBG8) The near and far clip planes. Objects closer or farther than these planes are not rendered. Near clipping plane Far clipping plane Enable or disable saving of camera frames. True = saving enabled The path name which will hold the frame data. If path name is relative, then directory is relative to current working directory. Depth camera parameters Type of output The near and far clip planes. Objects closer or farther than these planes are not detected by the depth camera. Near clipping plane for depth camera Far clipping plane for depth camera The properties of the noise model that should be applied to generated images The type of noise. Currently supported types are: "gaussian" (draw additive noise values independently for each pixel from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. Lens distortion to be applied to camera images. 
See http://en.wikipedia.org/wiki/Distortion_(optics)#Software_correction The radial distortion coefficient k1 The radial distortion coefficient k2 The radial distortion coefficient k3 The tangential distortion coefficient p1 The tangential distortion coefficient p2 The distortion center or principal point Lens projection description Type of the lens mapping. Supported values are gnomonical, stereographic, equidistant, equisolid_angle, orthographic, custom. For gnomonical (perspective) projection, it is recommended to specify a horizontal_fov of less than or equal to 90° If true the image will be scaled to fit horizontal FOV, otherwise it will be shown according to projection type parameters Definition of custom mapping function in a form of r=c1*f*fun(theta/c2 + c3). See https://en.wikipedia.org/wiki/Fisheye_lens#Mapping_function Linear scaling constant Angle scaling constant Angle offset constant Focal length of the optical system. Note: It's not a focal length of the lens in a common sense! This value is ignored if 'scale_to_fov' is set to true Possible values are 'sin', 'tan' and 'id' Everything outside of the specified angle will be hidden, 90° by default Resolution of the environment cube map used to draw the world Camera intrinsic parameters for setting a custom perspective projection matrix (cannot be used with WideAngleCamera since this class uses image stitching from 6 different cameras for achieving a wide field of view). The focal lengths can be computed using focal_length_in_pixels = (image_width_in_pixels * 0.5) / tan(field_of_view_in_degrees * 0.5 * PI/180) X focal length (in pixels, overrides horizontal_fov) Y focal length (in pixels, overrides horizontal_fov) X principal point (in pixels) Y principal point (in pixels) XY axis skew A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. These elements are specific to the contact sensor. name of the collision element within a link that acts as the contact sensor. Topic on which contact data is published. These elements are specific to the GPS sensor. Parameters related to GPS position measurement. Noise parameters for horizontal position measurement, in units of meters. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. 
A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical position measurement, in units of meters. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to GPS position measurement. Noise parameters for horizontal velocity measurement, in units of meters/second. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical velocity measurement, in units of meters/second. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. 
For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the IMU sensor. This string represents special hardcoded use cases that are commonly seen with typical robot IMU's: - CUSTOM: use Euler angle custom_rpy orientation specification. The orientation of the IMU's reference frame is defined by adding the custom_rpy rotation to the parent_frame. - NED: The IMU XYZ aligns with NED, where NED orientation relative to Gazebo world is defined by the SphericalCoordinates class. - ENU: The IMU XYZ aligns with ENU, where ENU orientation relative to Gazebo world is defined by the SphericalCoordinates class. - NWU: The IMU XYZ aligns with NWU, where NWU orientation relative to Gazebo world is defined by the SphericalCoordinates class. - GRAV_UP: where direction of gravity maps to IMU reference frame Z-axis with Z-axis pointing in the opposite direction of gravity. IMU reference frame X-axis direction is defined by grav_dir_x. Note if grav_dir_x is parallel to gravity direction, this configuration fails. Otherwise, IMU reference frame X-axis is defined by projection of grav_dir_x onto a plane normal to the gravity vector. IMU reference frame Y-axis is a vector orthogonal to both X and Z axis following the right hand rule. - GRAV_DOWN: where direction of gravity maps to IMU reference frame Z-axis with Z-axis pointing in the direction of gravity. IMU reference frame X-axis direction is defined by grav_dir_x. Note if grav_dir_x is parallel to gravity direction, this configuration fails. Otherwise, IMU reference frame X-axis is defined by projection of grav_dir_x onto a plane normal to the gravity vector. IMU reference frame Y-axis is a vector orthogonal to both X and Z axis following the right hand rule. This field and parent_frame are used when localization is set to CUSTOM. Orientation (fixed axis roll, pitch yaw) transform from parent_frame to this IMU's reference frame. Some common examples are: - IMU reports in its local frame on boot. IMU sensor frame is the reference frame. Example: parent_frame="", custom_rpy="0 0 0" - IMU reports in Gazebo world frame. Example sdf: parent_frame="world", custom_rpy="0 0 0" - IMU reports in NWU frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between North-West-Up and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="0 0 0" - IMU reports in NED frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between North-East-Down and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="M_PI 0 0" - IMU reports in ENU frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between East-North-Up and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. 
Example sdf given world is NWU: parent_frame="world", custom_rpy="0 0 -0.5*M_PI" - IMU reports in ROS optical frame as described in http://www.ros.org/reps/rep-0103.html#suffix-frames, which is (z-forward, x-left to right when facing +z, y-top to bottom when facing +z). (default gazebo camera is +x:view direction, +y:left, +z:up). Example sdf: parent_frame="local", custom_rpy="-0.5*M_PI 0 -0.5*M_PI" Name of parent frame which the custom_rpy transform is defined relative to. It can be any valid fully scoped Gazebo Link name or the special reserved "world" frame. If left empty, use the sensor's own local frame. Used when localization is set to GRAV_UP or GRAV_DOWN, a projection of this vector into a plane that is orthogonal to the gravity vector defines the direction of the IMU reference frame's X-axis. grav_dir_x is defined in the coordinate frame as defined by the parent_frame element. Name of parent frame in which the grav_dir_x vector is defined. It can be any valid fully scoped Gazebo Link name or the special reserved "world" frame. If left empty, use the sensor's own local frame. Topic on which data is published. DEPRECATED. Use the topic element that is a child of the sensor element. These elements are specific to body-frame angular velocity, which is expressed in radians per second Angular velocity about the X axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Angular velocity about the Y axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). 
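As a rough sketch of how the IMU elements and the repeated noise block described above fit together; the localization choice, rate, and noise magnitudes are illustrative assumptions, and the remaining noise properties continue below:

  <sensor name="imu_sensor" type="imu">
    <update_rate>100</update_rate>
    <imu>
      <orientation_reference_frame>
        <localization>ENU</localization>
      </orientation_reference_frame>
      <angular_velocity>
        <x>
          <noise type="gaussian">
            <mean>0.0</mean>
            <stddev>2e-4</stddev>
            <bias_mean>0.0</bias_mean>
            <bias_stddev>8e-7</bias_stddev>
            <dynamic_bias_stddev>0.0</dynamic_bias_stddev>
            <dynamic_bias_correlation_time>3600</dynamic_bias_correlation_time>
          </noise>
        </x>
        <!-- y and z take the same noise block -->
      </angular_velocity>
      <linear_acceleration>
        <x>
          <noise type="gaussian">
            <mean>0.0</mean>
            <stddev>1.7e-2</stddev>
          </noise>
        </x>
      </linear_acceleration>
    </imu>
  </sensor>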
For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Angular velocity about the Z axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to body-frame linear acceleration, which is expressed in meters per second squared Linear acceleration about the X axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Linear acceleration about the Y axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. 
For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Linear acceleration about the Z axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the lidar sensor. The number of simulated lidar rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle The number of simulated lidar rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle specifies range properties of each simulated lidar The minimum distance for each lidar ray. The maximum distance for each lidar ray. Linear resolution of each lidar ray. The properties of the noise model that should be applied to generated scans The type of noise. Currently supported types are: "gaussian" (draw noise values independently for each beam from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. These elements are specific to logical camera sensors. A logical camera reports objects that fall within a frustum. Computation should be performed on the CPU. Near clipping distance of the view frustum Far clipping distance of the view frustum Aspect ratio of the near and far planes. This is the width divided by the height of the near or far planes. Horizontal field of view of the frustum, in radians. This is the angle between the frustum's vertex and the edges of the near or far plane. These elements are specific to a Magnetometer sensor. Parameters related to the body-frame X axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). 
"gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to the body-frame Y axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to the body-frame Z axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the ray (laser) sensor. 
The number of simulated rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater than or equal to min_angle The number of simulated rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater than or equal to min_angle specifies range properties of each simulated ray The minimum distance for each ray. The maximum distance for each ray. Linear resolution of each ray. The properties of the noise model that should be applied to generated scans The type of noise. Currently supported types are: "gaussian" (draw noise values independently for each beam from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. These elements are specific to the sonar sensor. The sonar collision shape. Currently supported geometries are: "cone" and "sphere". Minimum range Maximum range Radius of the sonar cone at max range. This parameter is only used if geometry is "cone". These elements are specific to a wireless transceiver. Service set identifier (network name) Specifies the frequency of transmission in MHz. Only a frequency range is filtered. Here we set the lower bound (MHz). Only a frequency range is filtered. Here we set the upper bound (MHz). Specifies the antenna gain in dBi Specifies the transmission power in dBm Minimum received signal power in dBm These elements are specific to the force torque sensor. Frame in which to report the wrench values. Currently supported frames are: "parent": report the wrench expressed in the orientation of the parent link frame, "child": report the wrench expressed in the orientation of the child link frame, "sensor": report the wrench expressed in the orientation of the joint sensor frame. Note that for each option the point with respect to which the torque component of the wrench is expressed is the joint origin. Direction of the wrench measured by the sensor. The supported options are: "parent_to_child" if the measured wrench is the one applied by the parent link on the child link, "child_to_parent" if the measured wrench is the one applied by the child link on the parent link. Name of the projector Texture name Field of view Near clip distance Far clip distance A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. An audio sink. An audio source. URI of the audio media.
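Returning briefly to the force torque sensor described above, a minimal sketch of the frame and measure_direction options; this sensor element is placed inside a joint, the name and rate are illustrative, and the audio source description continues below:

  <sensor name="wrist_ft" type="force_torque">
    <update_rate>100</update_rate>
    <force_torque>
      <frame>child</frame>
      <measure_direction>child_to_parent</measure_direction>
    </force_torque>
  </sensor>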
Pitch for the audio media, in Hz. Gain for the audio media, in dB. List of collision objects that will trigger audio playback. Name of child collision element that will trigger audio playback. True to make the audio source loop playback. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. Description of a battery. Unique name for the battery. Initial voltage in volts. The light element describes a light source. A unique name for the light. The light type: point, directional, spot. When true, the light will cast shadows. Diffuse light color Specular light color Light attenuation Range of the light The linear attenuation factor: 1 means attenuate evenly over the distance. The constant attenuation factor: 1.0 means never attenuate, 0.0 is complete attenuation. The quadratic attenuation factor: adds a curvature to the attenuation. Direction of the light, only applicable for spot and directional lights. Spot light parameters Angle covered by the bright inner cone Angle covered by the outer cone The rate of falloff between the inner and outer cones. 1.0 means a linear falloff, less means slower falloff, higher means faster falloff. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A joint connects two links with kinematic and dynamic properties. By default, the pose of a joint is expressed in the child link frame. A unique name for the joint within the scope of the model. The type of joint, which must be one of the following: (continuous) a hinge joint that rotates on a single axis with a continuous range of motion, (revolute) a hinge joint that rotates on a single axis with a fixed range of motion, (gearbox) geared revolute joints, (revolute2) same as two revolute joints connected in series, (prismatic) a sliding joint that slides along an axis with a limited range specified by upper and lower limits, (ball) a ball and socket joint, (screw) a single degree of freedom joint with coupled sliding and rotational motion, (universal) like a ball joint, but constrains one degree of freedom, (fixed) a joint with zero degrees of freedom that rigidly connects two links. Name of the parent link Name of the child link Parameter for gearbox joints. Given theta_1 and theta_2 defined in the description of gearbox_reference_body, theta_2 = -gearbox_ratio * theta_1. Parameter for gearbox joints. Gearbox ratio is enforced over two joint angles. First joint angle (theta_1) is the angle from the gearbox_reference_body to the parent link in the direction of the axis element and the second joint angle (theta_2) is the angle from the gearbox_reference_body to the child link in the direction of the axis2 element. Parameter for screw joints.
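A minimal revolute joint sketch combining the joint attributes above with the axis, limit, and dynamics elements described next; link names, limits, and damping values are illustrative:

  <joint name="wheel_hinge" type="revolute">
    <parent>chassis</parent>
    <child>wheel</child>
    <axis>
      <xyz>0 1 0</xyz>
      <limit>
        <lower>-1.57</lower>
        <upper>1.57</upper>
        <effort>10</effort>
        <velocity>5</velocity>
      </limit>
      <dynamics>
        <damping>0.1</damping>
        <friction>0.05</friction>
        <spring_reference>0</spring_reference>
        <spring_stiffness>0</spring_stiffness>
      </dynamics>
    </axis>
  </joint>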
Parameters related to the axis of rotation for revolute joints, the axis of translation for prismatic joints. Default joint position for this joint axis. Represents the x,y,z components of the axis unit vector. The axis is expressed in the joint frame unless the use_parent_model_frame flag is set to true. The vector should be normalized. Flag to interpret the axis xyz element in the parent model frame instead of joint frame. Provided for Gazebo compatibility (see https://bitbucket.org/osrf/gazebo/issue/494 ). An element specifying physical properties of the joint. These values are used to specify modeling properties of the joint, particularly useful for simulation. The physical velocity dependent viscous damping coefficient of the joint. The physical static friction value of the joint. The spring reference position for this joint axis. The spring stiffness for this joint axis. specifies the limits of this joint Specifies the lower joint limit (radians for revolute joints, meters for prismatic joints). Omit if joint is continuous. Specifies the upper joint limit (radians for revolute joints, meters for prismatic joints). Omit if joint is continuous. A value for enforcing the maximum joint effort applied. Limit is not enforced if value is negative. A value for enforcing the maximum joint velocity. Joint stop stiffness. Joint stop dissipation. Parameters related to the second axis of rotation for revolute2 joints and universal joints. Default joint position for this joint axis. Represents the x,y,z components of the axis unit vector. The axis is expressed in the joint frame unless the use_parent_model_frame flag is set to true. The vector should be normalized. Flag to interpret the axis xyz element in the parent model frame instead of joint frame. Provided for Gazebo compatibility (see https://bitbucket.org/osrf/gazebo/issue/494 ). An element specifying physical properties of the joint. These values are used to specify modeling properties of the joint, particularly useful for simulation. The physical velocity dependent viscous damping coefficient of the joint. EXPERIMENTAL: if damping coefficient is negative and implicit_spring_damper is true, adaptive damping is used. The physical static friction value of the joint. The spring reference position for this joint axis. The spring stiffness for this joint axis. An attribute specifying the lower joint limit (radians for revolute joints, meters for prismatic joints). Omit if joint is continuous. An attribute specifying the upper joint limit (radians for revolute joints, meters for prismatic joints). Omit if joint is continuous. An attribute for enforcing the maximum joint effort applied by Joint::SetForce. Limit is not enforced if value is negative. (not implemented) An attribute for enforcing the maximum joint velocity. Joint stop stiffness. Supported physics engines: SimBody. Joint stop dissipation. Supported physics engines: SimBody. Parameters that are specific to a certain physics engine. Simbody specific parameters Force cut in the multibody graph at this joint. ODE specific parameters (DEPRECATION WARNING: In SDFormat 1.5 this tag will be replaced by the same tag directly under the physics-block. For now, this tag overrides the one outside of ode-block, but in SDFormat 1.5 this tag will be removed completely.) If provide feedback is set to true, ODE will compute the constraint forces at this joint. 
If cfm damping is set to true, ODE will use CFM to simulate damping, allows for infinite damping, and one additional constraint row (previously used for joint limit) is always active. If implicit_spring_damper is set to true, ODE will use CFM, ERP to simulate stiffness and damping, allows for infinite damping, and one additional constraint row (previously used for joint limit) is always active. This replaces cfm_damping parameter in SDFormat 1.4. Scale the excess force in a joint motor at joint limits. Should be between zero and one. Constraint force mixing for constrained directions Error reduction parameter for constrained directions Bounciness of the limits Maximum force or torque used to reach the desired velocity. The desired velocity of the joint. Should only be set if you want the joint to move on load. Constraint force mixing parameter used by the joint stop Error reduction parameter used by the joint stop Suspension constraint force mixing parameter Suspension error reduction parameter If provide feedback is set to true, the physics engine will compute the constraint forces at this joint. For now, provide_feedback under ode block will override this tag and give a user warning about the migration. provide_feedback under ode is scheduled to be removed in SDFormat 1.5. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. The sensor tag describes the type and properties of a sensor. A unique name for the sensor. This name must not match another model in the model. The type name of the sensor. By default, SDFormat supports types air_pressure, altimeter, camera, contact, depth_camera, force_torque, gps, gpu_lidar, gpu_ray, imu, lidar, logical_camera, magnetometer, multicamera, ray, rfid, rfidtag, rgbd_camera, sonar, thermal_camera, wireless_receiver, and wireless_transmitter. The "ray" and "gpu_ray" types are equivalent to "lidar" and "gpu_lidar", respectively. It is preferred to use "lidar" and "gpu_lidar" since "ray" and "gpu_ray" will be deprecated. The "ray" and "gpu_ray" types are maintained for legacy support. If true the sensor will always be updated according to the update rate. The frequency at which the sensor data is generated. If left unspecified, the sensor will generate data every cycle. If true, the sensor is visualized in the GUI Name of the topic on which data is published. This is necessary for visualization A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. These elements are specific to an air pressure sensor. 
The initial altitude in meters. This value can be used by a sensor implementation to augment the altitude of the sensor. For example, if you are using simulation instead of creating a 1000 m mountain model on which to place your sensor, you could instead set this value to 1000 and place your model on a ground plane with a Z height of zero. Noise parameters for the pressure data. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to an altimeter sensor. Noise parameters for vertical position The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical velocity The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. 
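Because this noise block recurs for many sensor channels, one illustrative sketch is given here; the values are placeholders, and whether the noise type appears as an attribute or as a child element depends on the SDFormat version (the attribute form is shown):

<noise type="gaussian">
  <mean>0.0</mean>
  <stddev>0.01</stddev>
  <bias_mean>0.001</bias_mean>
  <bias_stddev>0.0001</bias_stddev>
  <precision>0.0</precision>
</noise>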
For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to camera sensors. An optional name for the camera. Horizontal field of view The image size in pixels and format. Width in pixels Height in pixels (L8|R8G8B8|B8G8R8|BAYER_RGGB8|BAYER_BGGR8|BAYER_GBRG8|BAYER_GRBG8) The near and far clip planes. Objects closer or farther than these planes are not rendered. Near clipping plane Far clipping plane Enable or disable saving of camera frames. True = saving enabled The path name which will hold the frame data. If path name is relative, then directory is relative to current working directory. Depth camera parameters Type of output The near and far clip planes. Objects closer or farther than these planes are not detected by the depth camera. Near clipping plane for depth camera Far clipping plane for depth camera The properties of the noise model that should be applied to generated images The type of noise. Currently supported types are: "gaussian" (draw additive noise values independently for each pixel from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. Lens distortion to be applied to camera images. See http://en.wikipedia.org/wiki/Distortion_(optics)#Software_correction The radial distortion coefficient k1 The radial distortion coefficient k2 The radial distortion coefficient k3 The tangential distortion coefficient p1 The tangential distortion coefficient p2 The distortion center or principal point Lens projection description Type of the lens mapping. Supported values are gnomonical, stereographic, equidistant, equisolid_angle, orthographic, custom. For gnomonical (perspective) projection, it is recommended to specify a horizontal_fov of less than or equal to 90° If true the image will be scaled to fit horizontal FOV, otherwise it will be shown according to projection type parameters Definition of custom mapping function in a form of r=c1*f*fun(theta/c2 + c3). See https://en.wikipedia.org/wiki/Fisheye_lens#Mapping_function Linear scaling constant Angle scaling constant Angle offset constant Focal length of the optical system. Note: It's not a focal length of the lens in a common sense! This value is ignored if 'scale_to_fov' is set to true Possible values are 'sin', 'tan' and 'id' Everything outside of the specified angle will be hidden, 90° by default Resolution of the environment cube map used to draw the world Camera intrinsic parameters for setting a custom perspective projection matrix (cannot be used with WideAngleCamera since this class uses image stitching from 6 different cameras for achieving a wide field of view). The focal lengths can be computed using focal_length_in_pixels = (image_width_in_pixels * 0.5) / tan(field_of_view_in_degrees * 0.5 * PI/180) X focal length (in pixels, overrides horizontal_fov) Y focal length (in pixels, overrides horizontal_fov) X principal point (in pixels) Y principal point (in pixels) XY axis skew A frame of reference to which a pose is relative. 
Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. These elements are specific to the contact sensor. name of the collision element within a link that acts as the contact sensor. Topic on which contact data is published. These elements are specific to the GPS sensor. Parameters related to GPS position measurement. Noise parameters for horizontal position measurement, in units of meters. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical position measurement, in units of meters. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to GPS position measurement. Noise parameters for horizontal velocity measurement, in units of meters/second. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. 
rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical velocity measurement, in units of meters/second. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the IMU sensor. This string represents special hardcoded use cases that are commonly seen with typical robot IMU's: - CUSTOM: use Euler angle custom_rpy orientation specification. The orientation of the IMU's reference frame is defined by adding the custom_rpy rotation to the parent_frame. - NED: The IMU XYZ aligns with NED, where NED orientation relative to Gazebo world is defined by the SphericalCoordinates class. - ENU: The IMU XYZ aligns with ENU, where ENU orientation relative to Gazebo world is defined by the SphericalCoordinates class. - NWU: The IMU XYZ aligns with NWU, where NWU orientation relative to Gazebo world is defined by the SphericalCoordinates class. - GRAV_UP: where direction of gravity maps to IMU reference frame Z-axis with Z-axis pointing in the opposite direction of gravity. IMU reference frame X-axis direction is defined by grav_dir_x. Note if grav_dir_x is parallel to gravity direction, this configuration fails. Otherwise, IMU reference frame X-axis is defined by projection of grav_dir_x onto a plane normal to the gravity vector. IMU reference frame Y-axis is a vector orthogonal to both X and Z axis following the right hand rule. - GRAV_DOWN: where direction of gravity maps to IMU reference frame Z-axis with Z-axis pointing in the direction of gravity. IMU reference frame X-axis direction is defined by grav_dir_x. 
Note if grav_dir_x is parallel to gravity direction, this configuration fails. Otherwise, IMU reference frame X-axis is defined by projection of grav_dir_x onto a plane normal to the gravity vector. IMU reference frame Y-axis is a vector orthogonal to both X and Z axis following the right hand rule. This field and parent_frame are used when localization is set to CUSTOM. Orientation (fixed axis roll, pitch yaw) transform from parent_frame to this IMU's reference frame. Some common examples are: - IMU reports in its local frame on boot. IMU sensor frame is the reference frame. Example: parent_frame="", custom_rpy="0 0 0" - IMU reports in Gazebo world frame. Example sdf: parent_frame="world", custom_rpy="0 0 0" - IMU reports in NWU frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between North-West-Up and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="0 0 0" - IMU reports in NED frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between North-East-Down and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="M_PI 0 0" - IMU reports in ENU frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between East-North-Up and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="0 0 -0.5*M_PI" - IMU reports in ROS optical frame as described in http://www.ros.org/reps/rep-0103.html#suffix-frames, which is (z-forward, x-left to right when facing +z, y-top to bottom when facing +z). (default gazebo camera is +x:view direction, +y:left, +z:up). Example sdf: parent_frame="local", custom_rpy="-0.5*M_PI 0 -0.5*M_PI" Name of parent frame which the custom_rpy transform is defined relative to. It can be any valid fully scoped Gazebo Link name or the special reserved "world" frame. If left empty, use the sensor's own local frame. Used when localization is set to GRAV_UP or GRAV_DOWN, a projection of this vector into a plane that is orthogonal to the gravity vector defines the direction of the IMU reference frame's X-axis. grav_dir_x is defined in the coordinate frame as defined by the parent_frame element. Name of parent frame in which the grav_dir_x vector is defined. It can be any valid fully scoped Gazebo Link name or the special reserved "world" frame. If left empty, use the sensor's own local frame. Topic on which data is published. DEPRECATED. Use the topic element that is a child of the sensor element. These elements are specific to body-frame angular velocity, which is expressed in radians per second Angular velocity about the X axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. 
For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Angular velocity about the Y axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Angular velocity about the Z axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to body-frame linear acceleration, which is expressed in meters per second squared Linear acceleration about the X axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. 
For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Linear acceleration about the Y axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Linear acceleration about the Z axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the lidar sensor. The number of simulated lidar rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. 
Must be greater or equal to min_angle The number of simulated lidar rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle specifies range properties of each simulated lidar The minimum distance for each lidar ray. The maximum distance for each lidar ray. Linear resolution of each lidar ray. The properties of the noise model that should be applied to generated scans The type of noise. Currently supported types are: "gaussian" (draw noise values independently for each beam from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. These elements are specific to logical camera sensors. A logical camera reports objects that fall within a frustum. Computation should be performed on the CPU. Near clipping distance of the view frustum Far clipping distance of the view frustum Aspect ratio of the near and far planes. This is the width divided by the height of the near or far planes. Horizontal field of view of the frustum, in radians. This is the angle between the frustum's vertex and the edges of the near or far plane. These elements are specific to a Magnetometer sensor. Parameters related to the body-frame X axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to the body-frame Y axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. 
For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to the body-frame Z axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the ray (laser) sensor. The number of simulated rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle The number of simulated rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle specifies range properties of each simulated ray The minimum distance for each ray. The maximum distance for each ray. Linear resolution of each ray. The properties of the noise model that should be applied to generated scans The type of noise. Currently supported types are: "gaussian" (draw noise values independently for each beam from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. These elements are specific to the sonar sensor. The sonar collision shape. Currently supported geometries are: "cone" and "sphere". Minimum range Max range Radius of the sonar cone at max range. This parameter is only used if geometry is "cone". These elements are specific to a wireless transceiver. Service set identifier (network name) Specifies the frequency of transmission in MHz Only a frequency range is filtered. Here we set the lower bound (MHz). Only a frequency range is filtered. Here we set the upper bound (MHz). 
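Gathering the ray (laser) scan and range elements described above into one illustrative sketch (sample counts, angles, and ranges are placeholders; the horizontal element is the standard grouping for a scan in SDF):

<ray>
  <scan>
    <horizontal>
      <samples>640</samples>
      <resolution>1</resolution>
      <min_angle>-1.57</min_angle>
      <max_angle>1.57</max_angle>
    </horizontal>
  </scan>
  <range>
    <min>0.1</min>
    <max>10.0</max>
    <resolution>0.01</resolution>
  </range>
</ray>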
Specifies the antenna gain in dBi Specifies the transmission power in dBm Minimum received signal power in dBm These elements are specific to the force torque sensor. Frame in which to report the wrench values. Currently supported frames are: "parent" report the wrench expressed in the orientation of the parent link frame, "child" report the wrench expressed in the orientation of the child link frame, "sensor" report the wrench expressed in the orientation of the joint sensor frame. Note that for each option the point with respect to which the torque component of the wrench is expressed is the joint origin. Direction of the wrench measured by the sensor. The supported options are: "parent_to_child" if the measured wrench is the one applied by parent link on the child link, "child_to_parent" if the measured wrench is the one applied by the child link on the parent link. A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. The model element defines a complete robot or any other physical object. A unique name for the model. This name must not match another model in the world. If set to true, the model is immovable. Otherwise the model is simulated in the dynamics engine. If set to true, all links in the model will collide with each other (except those connected by a joint). Can be overridden by the link or collision element self_collide property. Two links within a model will collide if link1.self_collide OR link2.self_collide. Links connected by a joint will never collide. Allows a model to auto-disable, which means the physics engine can skip updating the model when the model is at rest. This parameter is only used by models with no joints. Include resources from a URI. This can be used to nest models. URI to a resource, such as a model Override the pose of the included model. A position and orientation in the global coordinate frame for the model. Position(x,y,z) and rotation (roll, pitch yaw) in the global coordinate frame. Override the name of the included model. Override the static value of the included model. A nested model element A unique name for the model. This name must not match another nested model in the same level as this model. If set to true, all links in the model will be affected by the wind. Can be overridden by the link wind property. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A physical link with inertia, collision, and visual properties. A link must be a child of a model, and any number of links may exist in a model. A unique name for the link within the scope of the model. If true, the link is affected by gravity. If true, the link is affected by the wind. If true, the link can collide with other links in the model. Two links within a model will collide if link1.self_collide OR link2.self_collide. Links connected by a joint will never collide. 
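Combining the model, include, and link elements described above (and continued below) into one illustrative sketch; the URI and names are hypothetical:

<model name="cart">
  <static>false</static>
  <allow_auto_disable>true</allow_auto_disable>
  <include>
    <uri>model://example_caster</uri>
    <name>front_caster</name>
    <pose>0.2 0 0 0 0 0</pose>
  </include>
  <link name="chassis">
    <gravity>true</gravity>
    <self_collide>false</self_collide>
  </link>
</model>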
If true, the link is kinematic only If true, the link will have 6DOF and be a direct child of world. Exponential damping of the link's velocity. Linear damping Angular damping A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. The inertial properties of the link. The mass of the link. The 3x3 rotational inertia matrix. Because the rotational inertia matrix is symmetric, only 6 above-diagonal elements of this matrix are specified here, using the attributes ixx, ixy, ixz, iyy, iyz, izz. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. This is the pose of the inertial reference frame, relative to the specified reference frame. The origin of the inertial reference frame needs to be at the center of gravity. The axes of the inertial reference frame do not need to be aligned with the principal axes of the inertia. Name of frame which the pose is defined relative to. The collision properties of a link. Note that this can be different from the visual properties of a link, for example, simpler collision models are often used to reduce computation time. Unique name for the collision element within the scope of the parent link. intensity value returned by laser sensor. Maximum number of contacts allowed between two entities. This value overrides the max_contacts element defined in physics. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. The shape of the visual or collision object. You can use the empty tag to make empty geometries. Box shape The three side lengths of the box. The origin of the box is in its geometric center (inside the center of the box). Cylinder shape Radius of the cylinder Length of the cylinder A heightmap based on a 2d grayscale image. URI to a grayscale image file The size of the heightmap in world units. When loading an image: "size" is used if present, otherwise defaults to 1x1x1. When loading a DEM: "size" is used if present, otherwise defaults to true size of DEM. A position offset. The heightmap can contain multiple textures. The order of the texture matters. The first texture will appear at the lowest height, and the last texture at the highest height. Use blend to control the height thresholds and fade between textures. Size of the applied texture in meters. Diffuse texture image filename Normalmap texture image filename The blend tag controls how two adjacent textures are mixed. The number of blend elements should equal one less than the number of textures. 
Min height of a blend layer Distance over which the blend occurs Set if the rendering engine will use terrain paging Samples per heightmap datum. For rasterized heightmaps, this indicates the number of samples to take per pixel. Using a lower value, e.g. 1, will generally improve the performance of the heightmap but lower the heightmap quality. Extrude a set of boxes from a grayscale image. URI of the grayscale image file Scaling factor applied to the image Grayscale threshold Height of the extruded boxes The amount of error in the model Mesh shape Mesh uri Use a named submesh. The submesh must exist in the mesh specified by the uri Name of the submesh within the parent mesh Set to true to center the vertices of the submesh at 0,0,0. This will effectively remove any transformations on the submesh before the poses from parent links and models are applied. Scaling factor applied to the mesh Plane shape Normal direction for the plane. When a Plane is used as a geometry for a Visual or Collision object, then the normal is specified in the Visual or Collision frame, respectively. Length of each side of the plane. Note that this property is meaningful only for visualizing the Plane, i.e., when the Plane is used as a geometry for a Visual object. The Plane has infinite size when used as a geometry for a Collision object. Defines an extruded polyline shape A series of points that define the path of the polyline. Height of the polyline Sphere shape radius of the sphere The surface parameters Bounciness coefficient of restitution, from [0...1], where 0=no bounciness. Bounce capture velocity, below which effective coefficient of restitution is 0. Parameters for torsional friction Torsional friction coefficient, unitless maximum ratio of tangential stress to normal stress. If this flag is true, torsional friction is calculated using the "patch_radius" parameter. If this flag is set to false, "surface_radius" (R) and contact depth (d) are used to compute the patch radius as sqrt(R*d). Radius of contact patch surface. Surface radius on the point of contact. Torsional friction parameters for ODE Force dependent slip for torsional friction, equivalent to inverse of viscous damping coefficient with units of rad/s/(Nm). A slip value of 0 is infinitely viscous. ODE friction parameters Coefficient of friction in first friction pyramid direction, the unitless maximum ratio of force in first friction pyramid direction to normal force. Coefficient of friction in second friction pyramid direction, the unitless maximum ratio of force in second friction pyramid direction to normal force. Unit vector specifying first friction pyramid direction in collision-fixed reference frame. If the friction pyramid model is in use, and this value is set to a unit vector for one of the colliding surfaces, the ODE Collide callback function will align the friction pyramid directions with a reference frame fixed to that collision surface. If both surfaces have this value set to a vector of zeros, the friction pyramid directions will be aligned with the world frame. If this value is set for both surfaces, the behavior is undefined. Force dependent slip in first friction pyramid direction, equivalent to inverse of viscous damping coefficient with units of m/s/N. A slip value of 0 is infinitely viscous. Force dependent slip in second friction pyramid direction, equivalent to inverse of viscous damping coefficient with units of m/s/N. A slip value of 0 is infinitely viscous. 
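An illustrative sketch of the ODE friction parameters described above, as they would appear inside a collision's surface block; the coefficient values are placeholders, and the tag names mu, mu2, fdir1, slip1, and slip2 are the standard SDF names for the coefficients and directions just described:

<surface>
  <friction>
    <ode>
      <mu>0.6</mu>
      <mu2>0.5</mu2>
      <fdir1>1 0 0</fdir1>
      <slip1>0.0</slip1>
      <slip2>0.0</slip2>
    </ode>
  </friction>
</surface>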
Coefficient of friction in first friction pyramid direction, the unitless maximum ratio of force in first friction pyramid direction to normal force. Coefficient of friction in second friction pyramid direction, the unitless maximum ratio of force in second friction pyramid direction to normal force. Unit vector specifying first friction pyramid direction in collision-fixed reference frame. If the friction pyramid model is in use, and this value is set to a unit vector for one of the colliding surfaces, the friction pyramid directions will be aligned with a reference frame fixed to that collision surface. If both surfaces have this value set to a vector of zeros, the friction pyramid directions will be aligned with the world frame. If this value is set for both surfaces, the behavior is undefined. Coefficient of rolling friction Flag to disable contact force generation, while still allowing collision checks and contact visualization to occur. Bitmask for collision filtering when collide_without_contact is on Bitmask for collision filtering. This will override collide_without_contact Bitmask for category of collision filtering. Collision happens if ((category1 & collision2) | (category2 & collision1)) is not zero. If not specified, the category_bitmask should be interpreted as being the same as collide_bitmask. Poisson's ratio is the unitless ratio between transverse and axial strain. This value must lie between (-1, 0.5). Defaults to 0.3 for typical steel. Note typical silicone elastomers have Poisson's ratio near 0.49 ~ 0.50. For reference, approximate values for Material:(Young's Modulus, Poisson's Ratio) for some of the typical materials are: Plastic: (1e8 ~ 3e9 Pa, 0.35 ~ 0.41), Wood: (4e9 ~ 1e10 Pa, 0.22 ~ 0.50), Aluminum: (7e10 Pa, 0.32 ~ 0.35), Steel: (2e11 Pa, 0.26 ~ 0.31). Young's Modulus in SI derived unit Pascal. Defaults to -1. If value is less or equal to zero, contact using elastic modulus (with Poisson's Ratio) is disabled. For reference, approximate values for Material:(Young's Modulus, Poisson's Ratio) for some of the typical materials are: Plastic: (1e8 ~ 3e9 Pa, 0.35 ~ 0.41), Wood: (4e9 ~ 1e10 Pa, 0.22 ~ 0.50), Aluminum: (7e10 Pa, 0.32 ~ 0.35), Steel: (2e11 Pa, 0.26 ~ 0.31). ODE contact parameters Soft constraint force mixing. Soft error reduction parameter dynamically "stiffness"-equivalent coefficient for contact joints dynamically "damping"-equivalent coefficient for contact joints maximum contact correction velocity truncation term. minimum allowable depth before contact correction impulse is applied Bullet contact parameters Soft constraint force mixing. Soft error reduction parameter dynamically "stiffness"-equivalent coefficient for contact joints dynamically "damping"-equivalent coefficient for contact joints Similar to ODE's max_vel implementation. See http://bulletphysics.org/mediawiki-1.5.8/index.php/BtContactSolverInfo#Split_Impulse for more information. Similar to ODE's max_vel implementation. See http://bulletphysics.org/mediawiki-1.5.8/index.php/BtContactSolverInfo#Split_Impulse for more information. soft contact parameters based on paper: http://www.cc.gatech.edu/graphics/projects/Sumit/homepage/papers/sigasia11/jain_softcontacts_siga11.pdf This is variable k_v in the soft contacts paper. Its unit is N/m. This is variable k_e in the soft contacts paper. Its unit is N/m. Viscous damping of point velocity in body frame. Its unit is N/m/s. Fraction of mass to be distributed among deformable nodes. The visual properties of the link. 
This element specifies the shape of the object (box, cylinder, etc.) for visualization purposes. Unique name for the visual element within the scope of the parent link. If true the visual will cast shadows. will be implemented in the future release. The amount of transparency( 0=opaque, 1 = fully transparent) Optional meta information for the visual. The information contained within this element should be used to provide additional feedback to an end user. The layer in which this visual is displayed. The layer number is useful for programs, such as Gazebo, that put visuals in different layers for enhanced visualization. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. The material of the visual element. Name of material from an installed script file. This will override the color element if the script exists. URI of the material script file Name of the script within the script file vertex, pixel, normal_map_objectspace, normal_map_tangentspace filename of the normal map If false, dynamic lighting will be disabled The ambient color of a material specified by set of four numbers representing red/green/blue, each in the range of [0,1]. The diffuse color of a material specified by set of four numbers representing red/green/blue/alpha, each in the range of [0,1]. The specular color of a material specified by set of four numbers representing red/green/blue/alpha, each in the range of [0,1]. The emissive color of a material specified by set of four numbers representing red/green/blue, each in the range of [0,1]. Physically Based Rendering (PBR) material. There are two PBR workflows: metal and specular. While both workflows and their parameters can be specified at the same time, typically only one of them will be used (depending on the underlying renderer capability). It is also recommended to use the same workflow for all materials in the world. PBR using the Metallic/Roughness workflow. Filename of the diffuse/albedo map. Filename of the roughness map. Material roughness in the range of [0,1], where 0 represents a smooth surface and 1 represents a rough surface. This is the inverse of a specular map in a PBR specular workflow. Filename of the metalness map. Material metalness in the range of [0,1], where 0 represents non-metal and 1 represents raw metal Filename of the environment / reflection map, typically in the form of a cubemap Filename of the ambient occlusion map. The map defines the amount of ambient lighting on the surface. Filename of the normal map. The normals can be in the object space or tangent space as specified in the 'type' attribute The space that the normals are in. Values are: 'object' or 'tangent' Filename of the emissive map. PBR using the Specular/Glossiness workflow. Filename of the diffuse/albedo map. Filename of the specular map. Filename of the glossiness map. Material glossiness in the range of [0-1], where 0 represents a rough surface and 1 represents a smooth surface. This is the inverse of a roughness map in a PBR metal workflow. Filename of the ambient occlusion map. The map defines the amount of ambient lighting on the surface. Filename of the normal map. 
The normals can be in the object space or tangent space as specified in the 'type' attribute The space that the normals are in. Values are: 'object' or 'tangent' Filename of the emissive map. The shape of the visual or collision object. You can use the empty tag to make empty geometries. Box shape The three side lengths of the box. The origin of the box is in its geometric center (inside the center of the box). Cylinder shape Radius of the cylinder Length of the cylinder A heightmap based on a 2d grayscale image. URI to a grayscale image file The size of the heightmap in world units. When loading an image: "size" is used if present, otherwise defaults to 1x1x1. When loading a DEM: "size" is used if present, otherwise defaults to true size of DEM. A position offset. The heightmap can contain multiple textures. The order of the texture matters. The first texture will appear at the lowest height, and the last texture at the highest height. Use blend to control the height thresholds and fade between textures. Size of the applied texture in meters. Diffuse texture image filename Normalmap texture image filename The blend tag controls how two adjacent textures are mixed. The number of blend elements should equal one less than the number of textures. Min height of a blend layer Distance over which the blend occurs Set if the rendering engine will use terrain paging Samples per heightmap datum. For rasterized heightmaps, this indicates the number of samples to take per pixel. Using a lower value, e.g. 1, will generally improve the performance of the heightmap but lower the heightmap quality. Extrude a set of boxes from a grayscale image. URI of the grayscale image file Scaling factor applied to the image Grayscale threshold Height of the extruded boxes The amount of error in the model Mesh shape Mesh uri Use a named submesh. The submesh must exist in the mesh specified by the uri Name of the submesh within the parent mesh Set to true to center the vertices of the submesh at 0,0,0. This will effectively remove any transformations on the submesh before the poses from parent links and models are applied. Scaling factor applied to the mesh Plane shape Normal direction for the plane. When a Plane is used as a geometry for a Visual or Collision object, then the normal is specified in the Visual or Collision frame, respectively. Length of each side of the plane. Note that this property is meaningful only for visualizing the Plane, i.e., when the Plane is used as a geometry for a Visual object. The Plane has infinite size when used as a geometry for a Collision object. Defines an extruded polyline shape A series of points that define the path of the polyline. Height of the polyline Sphere shape radius of the sphere A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. The sensor tag describes the type and properties of a sensor. A unique name for the sensor. This name must not match another model in the model. The type name of the sensor. By default, SDFormat supports types air_pressure, altimeter, camera, contact, depth_camera, force_torque, gps, gpu_lidar, gpu_ray, imu, lidar, logical_camera, magnetometer, multicamera, ray, rfid, rfidtag, rgbd_camera, sonar, thermal_camera, wireless_receiver, and wireless_transmitter. 
The "ray" and "gpu_ray" types are equivalent to "lidar" and "gpu_lidar", respectively. It is preferred to use "lidar" and "gpu_lidar" since "ray" and "gpu_ray" will be deprecated. The "ray" and "gpu_ray" types are maintained for legacy support. If true the sensor will always be updated according to the update rate. The frequency at which the sensor data is generated. If left unspecified, the sensor will generate data every cycle. If true, the sensor is visualized in the GUI Name of the topic on which data is published. This is necessary for visualization A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. These elements are specific to an air pressure sensor. The initial altitude in meters. This value can be used by a sensor implementation to augment the altitude of the sensor. For example, if you are using simulation instead of creating a 1000 m mountain model on which to place your sensor, you could instead set this value to 1000 and place your model on a ground plane with a Z height of zero. Noise parameters for the pressure data. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to an altimeter sensor. Noise parameters for vertical position The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. 
For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical velocity The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to camera sensors. An optional name for the camera. Horizontal field of view The image size in pixels and format. Width in pixels Height in pixels (L8|R8G8B8|B8G8R8|BAYER_RGGB8|BAYER_BGGR8|BAYER_GBRG8|BAYER_GRBG8) The near and far clip planes. Objects closer or farther than these planes are not rendered. Near clipping plane Far clipping plane Enable or disable saving of camera frames. True = saving enabled The path name which will hold the frame data. If path name is relative, then directory is relative to current working directory. Depth camera parameters Type of output The near and far clip planes. Objects closer or farther than these planes are not detected by the depth camera. Near clipping plane for depth camera Far clipping plane for depth camera The properties of the noise model that should be applied to generated images The type of noise. Currently supported types are: "gaussian" (draw additive noise values independently for each pixel from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. Lens distortion to be applied to camera images. See http://en.wikipedia.org/wiki/Distortion_(optics)#Software_correction The radial distortion coefficient k1 The radial distortion coefficient k2 The radial distortion coefficient k3 The tangential distortion coefficient p1 The tangential distortion coefficient p2 The distortion center or principal point Lens projection description Type of the lens mapping. 
Supported values are gnomonical, stereographic, equidistant, equisolid_angle, orthographic, custom. For gnomonical (perspective) projection, it is recommended to specify a horizontal_fov of less than or equal to 90° If true the image will be scaled to fit horizontal FOV, otherwise it will be shown according to projection type parameters Definition of custom mapping function in a form of r=c1*f*fun(theta/c2 + c3). See https://en.wikipedia.org/wiki/Fisheye_lens#Mapping_function Linear scaling constant Angle scaling constant Angle offset constant Focal length of the optical system. Note: It's not a focal length of the lens in a common sense! This value is ignored if 'scale_to_fov' is set to true Possible values are 'sin', 'tan' and 'id' Everything outside of the specified angle will be hidden, 90° by default Resolution of the environment cube map used to draw the world Camera intrinsic parameters for setting a custom perspective projection matrix (cannot be used with WideAngleCamera since this class uses image stitching from 6 different cameras for achieving a wide field of view). The focal lengths can be computed using focal_length_in_pixels = (image_width_in_pixels * 0.5) / tan(field_of_view_in_degrees * 0.5 * PI/180) X focal length (in pixels, overrides horizontal_fov) Y focal length (in pixels, overrides horizontal_fov) X principal point (in pixels) Y principal point (in pixels) XY axis skew A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. These elements are specific to the contact sensor. name of the collision element within a link that acts as the contact sensor. Topic on which contact data is published. These elements are specific to the GPS sensor. Parameters related to GPS position measurement. Noise parameters for horizontal position measurement, in units of meters. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical position measurement, in units of meters. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). 
"gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to GPS position measurement. Noise parameters for horizontal velocity measurement, in units of meters/second. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical velocity measurement, in units of meters/second. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. 
These elements are specific to the IMU sensor. This string represents special hardcoded use cases that are commonly seen with typical robot IMU's: - CUSTOM: use Euler angle custom_rpy orientation specification. The orientation of the IMU's reference frame is defined by adding the custom_rpy rotation to the parent_frame. - NED: The IMU XYZ aligns with NED, where NED orientation relative to Gazebo world is defined by the SphericalCoordinates class. - ENU: The IMU XYZ aligns with ENU, where ENU orientation relative to Gazebo world is defined by the SphericalCoordinates class. - NWU: The IMU XYZ aligns with NWU, where NWU orientation relative to Gazebo world is defined by the SphericalCoordinates class. - GRAV_UP: where direction of gravity maps to IMU reference frame Z-axis with Z-axis pointing in the opposite direction of gravity. IMU reference frame X-axis direction is defined by grav_dir_x. Note if grav_dir_x is parallel to gravity direction, this configuration fails. Otherwise, IMU reference frame X-axis is defined by projection of grav_dir_x onto a plane normal to the gravity vector. IMU reference frame Y-axis is a vector orthogonal to both X and Z axis following the right hand rule. - GRAV_DOWN: where direction of gravity maps to IMU reference frame Z-axis with Z-axis pointing in the direction of gravity. IMU reference frame X-axis direction is defined by grav_dir_x. Note if grav_dir_x is parallel to gravity direction, this configuration fails. Otherwise, IMU reference frame X-axis is defined by projection of grav_dir_x onto a plane normal to the gravity vector. IMU reference frame Y-axis is a vector orthogonal to both X and Z axis following the right hand rule. This field and parent_frame are used when localization is set to CUSTOM. Orientation (fixed axis roll, pitch yaw) transform from parent_frame to this IMU's reference frame. Some common examples are: - IMU reports in its local frame on boot. IMU sensor frame is the reference frame. Example: parent_frame="", custom_rpy="0 0 0" - IMU reports in Gazebo world frame. Example sdf: parent_frame="world", custom_rpy="0 0 0" - IMU reports in NWU frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between North-West-Up and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="0 0 0" - IMU reports in NED frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between North-East-Down and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="M_PI 0 0" - IMU reports in ENU frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between East-North-Up and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="0 0 -0.5*M_PI" - IMU reports in ROS optical frame as described in http://www.ros.org/reps/rep-0103.html#suffix-frames, which is (z-forward, x-left to right when facing +z, y-top to bottom when facing +z). (default gazebo camera is +x:view direction, +y:left, +z:up). Example sdf: parent_frame="local", custom_rpy="-0.5*M_PI 0 -0.5*M_PI" Name of parent frame which the custom_rpy transform is defined relative to. It can be any valid fully scoped Gazebo Link name or the special reserved "world" frame. 
If left empty, use the sensor's own local frame. Used when localization is set to GRAV_UP or GRAV_DOWN, a projection of this vector into a plane that is orthogonal to the gravity vector defines the direction of the IMU reference frame's X-axis. grav_dir_x is defined in the coordinate frame as defined by the parent_frame element. Name of parent frame in which the grav_dir_x vector is defined. It can be any valid fully scoped Gazebo Link name or the special reserved "world" frame. If left empty, use the sensor's own local frame. Topic on which data is published. DEPRECATED. Use the topic element that is a child of the sensor element. These elements are specific to body-frame angular velocity, which is expressed in radians per second Angular velocity about the X axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Angular velocity about the Y axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Angular velocity about the Z axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. 
For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to body-frame linear acceleration, which is expressed in meters per second squared Linear acceleration about the X axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Linear acceleration about the Y axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Linear acceleration about the Z axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). 
"gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the lidar sensor. The number of simulated lidar rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle The number of simulated lidar rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle specifies range properties of each simulated lidar The minimum distance for each lidar ray. The maximum distance for each lidar ray. Linear resolution of each lidar ray. The properties of the noise model that should be applied to generated scans The type of noise. Currently supported types are: "gaussian" (draw noise values independently for each beam from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. These elements are specific to logical camera sensors. A logical camera reports objects that fall within a frustum. Computation should be performed on the CPU. Near clipping distance of the view frustum Far clipping distance of the view frustum Aspect ratio of the near and far planes. This is the width divided by the height of the near or far planes. Horizontal field of view of the frustum, in radians. This is the angle between the frustum's vertex and the edges of the near or far plane. These elements are specific to a Magnetometer sensor. Parameters related to the body-frame X axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. 
For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to the body-frame Y axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to the body-frame Z axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the ray (laser) sensor. The number of simulated rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle The number of simulated rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. 
Must be greater or equal to min_angle specifies range properties of each simulated ray The minimum distance for each ray. The maximum distance for each ray. Linear resolution of each ray. The properties of the noise model that should be applied to generated scans The type of noise. Currently supported types are: "gaussian" (draw noise values independently for each beam from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. These elements are specific to the sonar sensor. The sonar collision shape. Currently supported geometries are: "cone" and "sphere". Minimum range Maximum range Radius of the sonar cone at max range. This parameter is only used if geometry is "cone". These elements are specific to a wireless transceiver. Service set identifier (network name) Specifies the frequency of transmission in MHz Only a frequency range is filtered. Here we set the lower bound (MHz). Only a frequency range is filtered. Here we set the upper bound (MHz). Specifies the antenna gain in dBi Specifies the transmission power in dBm Minimum received signal power in dBm These elements are specific to the force torque sensor. Frame in which to report the wrench values. Currently supported frames are: "parent" report the wrench expressed in the orientation of the parent link frame, "child" report the wrench expressed in the orientation of the child link frame, "sensor" report the wrench expressed in the orientation of the joint sensor frame. Note that for each option the point with respect to which the torque component of the wrench is expressed is the joint origin. Direction of the wrench measured by the sensor. The supported options are: "parent_to_child" if the measured wrench is the one applied by the parent link on the child link, "child_to_parent" if the measured wrench is the one applied by the child link on the parent link. Name of the projector Texture name Field of view Near clip distance Far clip distance A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. An audio sink. An audio source. URI of the audio media. Pitch for the audio media, in Hz Gain for the audio media, in dB. List of collision objects that will trigger audio playback. Name of child collision element that will trigger audio playback. True to make the audio source loop playback. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame.
Name of frame which the pose is defined relative to. Description of a battery. Unique name for the battery. Initial voltage in volts. The light element describes a light source. A unique name for the light. The light type: point, directional, spot. When true, the light will cast shadows. Diffuse light color Specular light color Light attenuation Range of the light The linear attenuation factor: 1 means attenuate evenly over the distance. The constant attenuation factor: 1.0 means never attenuate, 0.0 is complete attenuation. The quadratic attenuation factor: adds a curvature to the attenuation. Direction of the light, only applicable for spot and directional lights. Spot light parameters Angle covered by the bright inner cone Angle covered by the outer cone The rate of falloff between the inner and outer cones. 1.0 means a linear falloff, less means slower falloff, higher means faster falloff. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A joint connects two links with kinematic and dynamic properties. By default, the pose of a joint is expressed in the child link frame. A unique name for the joint within the scope of the model. The type of joint, which must be one of the following: (continuous) a hinge joint that rotates on a single axis with a continuous range of motion, (revolute) a hinge joint that rotates on a single axis with a fixed range of motion, (gearbox) geared revolute joints, (revolute2) same as two revolute joints connected in series, (prismatic) a sliding joint that slides along an axis with a limited range specified by upper and lower limits, (ball) a ball and socket joint, (screw) a single degree of freedom joint with coupled sliding and rotational motion, (universal) like a ball joint, but constrains one degree of freedom, (fixed) a joint with zero degrees of freedom that rigidly connects two links. Name of the parent link Name of the child link Parameter for gearbox joints. Given theta_1 and theta_2 defined in description for gearbox_reference_body, theta_2 = -gearbox_ratio * theta_1. Parameter for gearbox joints. Gearbox ratio is enforced over two joint angles. First joint angle (theta_1) is the angle from the gearbox_reference_body to the parent link in the direction of the axis element and the second joint angle (theta_2) is the angle from the gearbox_reference_body to the child link in the direction of the axis2 element. Parameter for screw joints. Parameters related to the axis of rotation for revolute joints and the axis of translation for prismatic joints. Default joint position for this joint axis. Represents the x,y,z components of the axis unit vector. The axis is expressed in the joint frame unless the use_parent_model_frame flag is set to true. The vector should be normalized. Flag to interpret the axis xyz element in the parent model frame instead of joint frame. Provided for Gazebo compatibility (see https://bitbucket.org/osrf/gazebo/issue/494 ). An element specifying physical properties of the joint. These values are used to specify modeling properties of the joint, particularly useful for simulation.
The physical velocity dependent viscous damping coefficient of the joint. The physical static friction value of the joint. The spring reference position for this joint axis. The spring stiffness for this joint axis. specifies the limits of this joint Specifies the lower joint limit (radians for revolute joints, meters for prismatic joints). Omit if joint is continuous. Specifies the upper joint limit (radians for revolute joints, meters for prismatic joints). Omit if joint is continuous. A value for enforcing the maximum joint effort applied. Limit is not enforced if value is negative. A value for enforcing the maximum joint velocity. Joint stop stiffness. Joint stop dissipation. Parameters related to the second axis of rotation for revolute2 joints and universal joints. Default joint position for this joint axis. Represents the x,y,z components of the axis unit vector. The axis is expressed in the joint frame unless the use_parent_model_frame flag is set to true. The vector should be normalized. Flag to interpret the axis xyz element in the parent model frame instead of joint frame. Provided for Gazebo compatibility (see https://bitbucket.org/osrf/gazebo/issue/494 ). An element specifying physical properties of the joint. These values are used to specify modeling properties of the joint, particularly useful for simulation. The physical velocity dependent viscous damping coefficient of the joint. EXPERIMENTAL: if damping coefficient is negative and implicit_spring_damper is true, adaptive damping is used. The physical static friction value of the joint. The spring reference position for this joint axis. The spring stiffness for this joint axis. An attribute specifying the lower joint limit (radians for revolute joints, meters for prismatic joints). Omit if joint is continuous. An attribute specifying the upper joint limit (radians for revolute joints, meters for prismatic joints). Omit if joint is continuous. An attribute for enforcing the maximum joint effort applied by Joint::SetForce. Limit is not enforced if value is negative. (not implemented) An attribute for enforcing the maximum joint velocity. Joint stop stiffness. Supported physics engines: SimBody. Joint stop dissipation. Supported physics engines: SimBody. Parameters that are specific to a certain physics engine. Simbody specific parameters Force cut in the multibody graph at this joint. ODE specific parameters (DEPRECATION WARNING: In SDFormat 1.5 this tag will be replaced by the same tag directly under the physics-block. For now, this tag overrides the one outside of ode-block, but in SDFormat 1.5 this tag will be removed completely.) If provide feedback is set to true, ODE will compute the constraint forces at this joint. If cfm damping is set to true, ODE will use CFM to simulate damping, allows for infinite damping, and one additional constraint row (previously used for joint limit) is always active. If implicit_spring_damper is set to true, ODE will use CFM and ERP to simulate stiffness and damping, allows for infinite damping, and one additional constraint row (previously used for joint limit) is always active. This replaces cfm_damping parameter in SDFormat 1.4. Scale the excess force applied by a joint motor at joint limits. Should be between zero and one. Constraint force mixing for constrained directions Error reduction parameter for constrained directions Bounciness of the limits Maximum force or torque used to reach the desired velocity. The desired velocity of the joint.
Should only be set if you want the joint to move on load. Constraint force mixing parameter used by the joint stop Error reduction parameter used by the joint stop Suspension constraint force mixing parameter Suspension error reduction parameter If provide feedback is set to true, physics engine will compute the constraint forces at this joint. For now, provide_feedback under ode block will override this tag and given user warning about the migration. provide_feedback under ode is scheduled to be removed in SDFormat 1.5. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. The sensor tag describes the type and properties of a sensor. A unique name for the sensor. This name must not match another model in the model. The type name of the sensor. By default, SDFormat supports types air_pressure, altimeter, camera, contact, depth_camera, force_torque, gps, gpu_lidar, gpu_ray, imu, lidar, logical_camera, magnetometer, multicamera, ray, rfid, rfidtag, rgbd_camera, sonar, thermal_camera, wireless_receiver, and wireless_transmitter. The "ray" and "gpu_ray" types are equivalent to "lidar" and "gpu_lidar", respectively. It is preferred to use "lidar" and "gpu_lidar" since "ray" and "gpu_ray" will be deprecated. The "ray" and "gpu_ray" types are maintained for legacy support. If true the sensor will always be updated according to the update rate. The frequency at which the sensor data is generated. If left unspecified, the sensor will generate data every cycle. If true, the sensor is visualized in the GUI Name of the topic on which data is published. This is necessary for visualization A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. These elements are specific to an air pressure sensor. The initial altitude in meters. This value can be used by a sensor implementation to augment the altitude of the sensor. For example, if you are using simulation instead of creating a 1000 m mountain model on which to place your sensor, you could instead set this value to 1000 and place your model on a ground plane with a Z height of zero. Noise parameters for the pressure data. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. 
For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to an altimeter sensor. Noise parameters for vertical position The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical velocity The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to camera sensors. An optional name for the camera. Horizontal field of view The image size in pixels and format. Width in pixels Height in pixels (L8|R8G8B8|B8G8R8|BAYER_RGGB8|BAYER_BGGR8|BAYER_GBRG8|BAYER_GRBG8) The near and far clip planes. Objects closer or farther than these planes are not rendered. 
Near clipping plane Far clipping plane Enable or disable saving of camera frames. True = saving enabled The path name which will hold the frame data. If path name is relative, then directory is relative to current working directory. Depth camera parameters Type of output The near and far clip planes. Objects closer or farther than these planes are not detected by the depth camera. Near clipping plane for depth camera Far clipping plane for depth camera The properties of the noise model that should be applied to generated images The type of noise. Currently supported types are: "gaussian" (draw additive noise values independently for each pixel from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. Lens distortion to be applied to camera images. See http://en.wikipedia.org/wiki/Distortion_(optics)#Software_correction The radial distortion coefficient k1 The radial distortion coefficient k2 The radial distortion coefficient k3 The tangential distortion coefficient p1 The tangential distortion coefficient p2 The distortion center or principal point Lens projection description Type of the lens mapping. Supported values are gnomonical, stereographic, equidistant, equisolid_angle, orthographic, custom. For gnomonical (perspective) projection, it is recommended to specify a horizontal_fov of less than or equal to 90° If true the image will be scaled to fit horizontal FOV, otherwise it will be shown according to projection type parameters Definition of custom mapping function in a form of r=c1*f*fun(theta/c2 + c3). See https://en.wikipedia.org/wiki/Fisheye_lens#Mapping_function Linear scaling constant Angle scaling constant Angle offset constant Focal length of the optical system. Note: It's not a focal length of the lens in a common sense! This value is ignored if 'scale_to_fov' is set to true Possible values are 'sin', 'tan' and 'id' Everything outside of the specified angle will be hidden, 90° by default Resolution of the environment cube map used to draw the world Camera intrinsic parameters for setting a custom perspective projection matrix (cannot be used with WideAngleCamera since this class uses image stitching from 6 different cameras for achieving a wide field of view). The focal lengths can be computed using focal_length_in_pixels = (image_width_in_pixels * 0.5) / tan(field_of_view_in_degrees * 0.5 * PI/180) X focal length (in pixels, overrides horizontal_fov) Y focal length (in pixels, overrides horizontal_fov) X principal point (in pixels) Y principal point (in pixels) XY axis skew A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. These elements are specific to the contact sensor. name of the collision element within a link that acts as the contact sensor. Topic on which contact data is published. These elements are specific to the GPS sensor. Parameters related to GPS position measurement. Noise parameters for horizontal position measurement, in units of meters. The properties of a sensor noise model. 
The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical position measurement, in units of meters. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to GPS velocity measurement. Noise parameters for horizontal velocity measurement, in units of meters/second. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals.
A value of zero implies infinite precision / no quantization. Noise parameters for vertical velocity measurement, in units of meters/second. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the IMU sensor. This string represents special hardcoded use cases that are commonly seen with typical robot IMU's: - CUSTOM: use Euler angle custom_rpy orientation specification. The orientation of the IMU's reference frame is defined by adding the custom_rpy rotation to the parent_frame. - NED: The IMU XYZ aligns with NED, where NED orientation relative to Gazebo world is defined by the SphericalCoordinates class. - ENU: The IMU XYZ aligns with ENU, where ENU orientation relative to Gazebo world is defined by the SphericalCoordinates class. - NWU: The IMU XYZ aligns with NWU, where NWU orientation relative to Gazebo world is defined by the SphericalCoordinates class. - GRAV_UP: where direction of gravity maps to IMU reference frame Z-axis with Z-axis pointing in the opposite direction of gravity. IMU reference frame X-axis direction is defined by grav_dir_x. Note if grav_dir_x is parallel to gravity direction, this configuration fails. Otherwise, IMU reference frame X-axis is defined by projection of grav_dir_x onto a plane normal to the gravity vector. IMU reference frame Y-axis is a vector orthogonal to both X and Z axis following the right hand rule. - GRAV_DOWN: where direction of gravity maps to IMU reference frame Z-axis with Z-axis pointing in the direction of gravity. IMU reference frame X-axis direction is defined by grav_dir_x. Note if grav_dir_x is parallel to gravity direction, this configuration fails. Otherwise, IMU reference frame X-axis is defined by projection of grav_dir_x onto a plane normal to the gravity vector. IMU reference frame Y-axis is a vector orthogonal to both X and Z axis following the right hand rule. This field and parent_frame are used when localization is set to CUSTOM. Orientation (fixed axis roll, pitch yaw) transform from parent_frame to this IMU's reference frame. Some common examples are: - IMU reports in its local frame on boot. IMU sensor frame is the reference frame. Example: parent_frame="", custom_rpy="0 0 0" - IMU reports in Gazebo world frame. Example sdf: parent_frame="world", custom_rpy="0 0 0" - IMU reports in NWU frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. 
rotation between North-West-Up and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="0 0 0" - IMU reports in NED frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between North-East-Down and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="M_PI 0 0" - IMU reports in ENU frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between East-North-Up and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="0 0 -0.5*M_PI" - IMU reports in ROS optical frame as described in http://www.ros.org/reps/rep-0103.html#suffix-frames, which is (z-forward, x-left to right when facing +z, y-top to bottom when facing +z). (default gazebo camera is +x:view direction, +y:left, +z:up). Example sdf: parent_frame="local", custom_rpy="-0.5*M_PI 0 -0.5*M_PI" Name of parent frame which the custom_rpy transform is defined relative to. It can be any valid fully scoped Gazebo Link name or the special reserved "world" frame. If left empty, use the sensor's own local frame. Used when localization is set to GRAV_UP or GRAV_DOWN, a projection of this vector into a plane that is orthogonal to the gravity vector defines the direction of the IMU reference frame's X-axis. grav_dir_x is defined in the coordinate frame as defined by the parent_frame element. Name of parent frame in which the grav_dir_x vector is defined. It can be any valid fully scoped Gazebo Link name or the special reserved "world" frame. If left empty, use the sensor's own local frame. Topic on which data is published. DEPRECATED. Use the topic element that is a child of the sensor element. These elements are specific to body-frame angular velocity, which is expressed in radians per second Angular velocity about the X axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Angular velocity about the Y axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. 
rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Angular velocity about the Z axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to body-frame linear acceleration, which is expressed in meters per second squared Linear acceleration about the X axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Linear acceleration about the Y axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). 
"gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Linear acceleration about the Z axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the lidar sensor. The number of simulated lidar rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle The number of simulated lidar rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle specifies range properties of each simulated lidar The minimum distance for each lidar ray. The maximum distance for each lidar ray. Linear resolution of each lidar ray. The properties of the noise model that should be applied to generated scans The type of noise. Currently supported types are: "gaussian" (draw noise values independently for each beam from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. 
These elements are specific to logical camera sensors. A logical camera reports objects that fall within a frustum. Computation should be performed on the CPU. Near clipping distance of the view frustum Far clipping distance of the view frustum Aspect ratio of the near and far planes. This is the width divided by the height of the near or far planes. Horizontal field of view of the frustum, in radians. This is the angle between the frustum's vertex and the edges of the near or far plane. These elements are specific to a Magnetometer sensor. Parameters related to the body-frame X axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to the body-frame Y axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to the body-frame Z axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. 
For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the ray (laser) sensor. The number of simulated rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle The number of simulated rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle specifies range properties of each simulated ray The minimum distance for each ray. The maximum distance for each ray. Linear resolution of each ray. The properties of the noise model that should be applied to generated scans The type of noise. Currently supported types are: "gaussian" (draw noise values independently for each beam from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. These elements are specific to the sonar sensor. The sonar collision shape. Currently supported geometries are: "cone" and "sphere". Minimum range Max range Radius of the sonar cone at max range. This parameter is only used if geometry is "cone". These elements are specific to a wireless transceiver. Service set identifier (network name) Specifies the frequency of transmission in MHz Only a frequency range is filtered. Here we set the lower bound (MHz). Only a frequency range is filtered. Here we set the upper bound (MHz). Specifies the antenna gain in dBi Specifies the transmission power in dBm Mininum received signal power in dBm These elements are specific to the force torque sensor. Frame in which to report the wrench values. Currently supported frames are: "parent" report the wrench expressed in the orientation of the parent link frame, "child" report the wrench expressed in the orientation of the child link frame, "sensor" report the wrench expressed in the orientation of the joint sensor frame. Note that for each option the point with respect to which the torque component of the wrench is expressed is the joint origin. Direction of the wrench measured by the sensor. The supported options are: "parent_to_child" if the measured wrench is the one applied by parent link on the child link, "child_to_parent" if the measured wrench is the one applied by the child link on the parent link. A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. 
A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. A special kind of model which can have a scripted motion. This includes both global waypoint type animations and skeleton animations. A unique name for the actor. (DEPRECATION WARNING: This is deprecated in 1.6 and removed in 1.7.) Actors should be static, so this is always true. Skin file which defines a visual and the underlying skeleton which moves it. Path to skin file, accepted formats: COLLADA, BVH. Scale the skin's size. Animation file defines an animation for the skeleton in the skin. The skeleton must be compatible with the skin skeleton. Unique name for animation. Path to animation file. Accepted formats: COLLADA, BVH. Scale for the animation skeleton. Set to true so the animation is interpolated on X. Adds scripted trajectories to the actor. Set this to true for the script to be repeated in a loop. For a fluid continuous motion, make sure the last waypoint matches the first one. This is the time to wait before starting the script. If running in a loop, this time will be waited before starting each cycle. Set to true if the animation should start as soon as the simulation starts playing. It is useful to set this to false if the animation should only start playing when triggered by a plugin, for example. The trajectory contains a series of keyframes to be followed. Unique id for a trajectory. If it matches the type of an animation, they will be played at the same time. The tension of the trajectory spline. The default value of zero equates to a Catmull-Rom spline, which may also cause the animation to overshoot keyframes. A value of one will cause the animation to stick to the keyframes. Each point in the trajectory. The time in seconds, counted from the beginning of the script, when the pose should be reached. The pose which should be reached at the given time. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A physical link with inertia, collision, and visual properties. A link must be a child of a model, and any number of links may exist in a model. A unique name for the link within the scope of the model. If true, the link is affected by gravity. If true, the link is affected by the wind. If true, the link can collide with other links in the model. Two links within a model will collide if link1.self_collide OR link2.self_collide. Links connected by a joint will never collide. If true, the link is kinematic only If true, the link will have 6DOF and be a direct child of world. Exponential damping of the link's velocity. Linear damping Angular damping A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to.
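For illustration, a minimal link sketch combining the flags and pose above with the inertial block described next might look as follows; the link name, pose, mass, and inertia values are placeholders.
  <link name="chassis">
    <pose>0 0 0.2 0 0 0</pose>
    <gravity>true</gravity>
    <self_collide>false</self_collide>
    <inertial>
      <mass>5.0</mass>
      <pose>0 0 0.05 0 0 0</pose>  <!-- inertial pose: origin at the link's center of gravity -->
      <inertia>
        <ixx>0.08</ixx> <ixy>0</ixy> <ixz>0</ixz>
        <iyy>0.08</iyy> <iyz>0</iyz>
        <izz>0.10</izz>
      </inertia>
    </inertial>
  </link>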
The inertial properties of the link. The mass of the link. The 3x3 rotational inertia matrix. Because the rotational inertia matrix is symmetric, only 6 above-diagonal elements of this matrix are specified here, using the sub-elements ixx, ixy, ixz, iyy, iyz, izz. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. This is the pose of the inertial reference frame, relative to the specified reference frame. The origin of the inertial reference frame needs to be at the center of gravity. The axes of the inertial reference frame do not need to be aligned with the principal axes of the inertia. Name of frame which the pose is defined relative to. The collision properties of a link. Note that this can be different from the visual properties of a link, for example, simpler collision models are often used to reduce computation time. Unique name for the collision element within the scope of the parent link. Intensity value returned by laser sensor. Maximum number of contacts allowed between two entities. This value overrides the max_contacts element defined in physics. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. The shape of the visual or collision object. You can use the empty tag to make empty geometries. Box shape The three side lengths of the box. The origin of the box is in its geometric center (inside the center of the box). Cylinder shape Radius of the cylinder Length of the cylinder A heightmap based on a 2d grayscale image. URI to a grayscale image file The size of the heightmap in world units. When loading an image: "size" is used if present, otherwise defaults to 1x1x1. When loading a DEM: "size" is used if present, otherwise defaults to true size of DEM. A position offset. The heightmap can contain multiple textures. The order of the texture matters. The first texture will appear at the lowest height, and the last texture at the highest height. Use blend to control the height thresholds and fade between textures. Size of the applied texture in meters. Diffuse texture image filename Normalmap texture image filename The blend tag controls how two adjacent textures are mixed. The number of blend elements should equal one less than the number of textures. Min height of a blend layer Distance over which the blend occurs Set if the rendering engine will use terrain paging Samples per heightmap datum. For rasterized heightmaps, this indicates the number of samples to take per pixel. Using a lower value, e.g. 1, will generally improve the performance of the heightmap but lower the heightmap quality. Extrude a set of boxes from a grayscale image. URI of the grayscale image file Scaling factor applied to the image Grayscale threshold Height of the extruded boxes The amount of error in the model Mesh shape Mesh uri Use a named submesh. The submesh must exist in the mesh specified by the uri Name of the submesh within the parent mesh Set to true to center the vertices of the submesh at 0,0,0.
This will effectively remove any transformations on the submesh before the poses from parent links and models are applied. Scaling factor applied to the mesh Plane shape Normal direction for the plane. When a Plane is used as a geometry for a Visual or Collision object, then the normal is specified in the Visual or Collision frame, respectively. Length of each side of the plane. Note that this property is meaningful only for visualizing the Plane, i.e., when the Plane is used as a geometry for a Visual object. The Plane has infinite size when used as a geometry for a Collision object. Defines an extruded polyline shape A series of points that define the path of the polyline. Height of the polyline Sphere shape radius of the sphere The surface parameters Bounciness coefficient of restitution, from [0...1], where 0=no bounciness. Bounce capture velocity, below which effective coefficient of restitution is 0. Parameters for torsional friction Torsional friction coefficient, unitless maximum ratio of tangential stress to normal stress. If this flag is true, torsional friction is calculated using the "patch_radius" parameter. If this flag is set to false, "surface_radius" (R) and contact depth (d) are used to compute the patch radius as sqrt(R*d). Radius of contact patch surface. Surface radius on the point of contact. Torsional friction parameters for ODE Force dependent slip for torsional friction, equivalent to inverse of viscous damping coefficient with units of rad/s/(Nm). A slip value of 0 is infinitely viscous. ODE friction parameters Coefficient of friction in first friction pyramid direction, the unitless maximum ratio of force in first friction pyramid direction to normal force. Coefficient of friction in second friction pyramid direction, the unitless maximum ratio of force in second friction pyramid direction to normal force. Unit vector specifying first friction pyramid direction in collision-fixed reference frame. If the friction pyramid model is in use, and this value is set to a unit vector for one of the colliding surfaces, the ODE Collide callback function will align the friction pyramid directions with a reference frame fixed to that collision surface. If both surfaces have this value set to a vector of zeros, the friction pyramid directions will be aligned with the world frame. If this value is set for both surfaces, the behavior is undefined. Force dependent slip in first friction pyramid direction, equivalent to inverse of viscous damping coefficient with units of m/s/N. A slip value of 0 is infinitely viscous. Force dependent slip in second friction pyramid direction, equivalent to inverse of viscous damping coefficient with units of m/s/N. A slip value of 0 is infinitely viscous. Coefficient of friction in first friction pyramid direction, the unitless maximum ratio of force in first friction pyramid direction to normal force. Coefficient of friction in second friction pyramid direction, the unitless maximum ratio of force in second friction pyramid direction to normal force. Unit vector specifying first friction pyramid direction in collision-fixed reference frame. If the friction pyramid model is in use, and this value is set to a unit vector for one of the colliding surfaces, the friction pyramid directions will be aligned with a reference frame fixed to that collision surface. If both surfaces have this value set to a vector of zeros, the friction pyramid directions will be aligned with the world frame. 
If this value is set for both surfaces, the behavior is undefined. Coefficient of rolling friction Flag to disable contact force generation, while still allowing collision checks and contact visualization to occur. Bitmask for collision filtering when collide_without_contact is on Bitmask for collision filtering. This will override collide_without_contact Bitmask for category of collision filtering. Collision happens if ((category1 & collision2) | (category2 & collision1)) is not zero. If not specified, the category_bitmask should be interpreted as being the same as collide_bitmask. Poisson's ratio is the unitless ratio between transverse and axial strain. This value must lie between (-1, 0.5). Defaults to 0.3 for typical steel. Note typical silicone elastomers have Poisson's ratio near 0.49 ~ 0.50. For reference, approximate values for Material:(Young's Modulus, Poisson's Ratio) for some of the typical materials are: Plastic: (1e8 ~ 3e9 Pa, 0.35 ~ 0.41), Wood: (4e9 ~ 1e10 Pa, 0.22 ~ 0.50), Aluminum: (7e10 Pa, 0.32 ~ 0.35), Steel: (2e11 Pa, 0.26 ~ 0.31). Young's Modulus in SI derived unit Pascal. Defaults to -1. If the value is less than or equal to zero, contact using elastic modulus (with Poisson's Ratio) is disabled. For reference, approximate values for Material:(Young's Modulus, Poisson's Ratio) for some of the typical materials are: Plastic: (1e8 ~ 3e9 Pa, 0.35 ~ 0.41), Wood: (4e9 ~ 1e10 Pa, 0.22 ~ 0.50), Aluminum: (7e10 Pa, 0.32 ~ 0.35), Steel: (2e11 Pa, 0.26 ~ 0.31). ODE contact parameters Soft constraint force mixing. Soft error reduction parameter dynamically "stiffness"-equivalent coefficient for contact joints dynamically "damping"-equivalent coefficient for contact joints maximum contact correction velocity truncation term. minimum allowable depth before contact correction impulse is applied Bullet contact parameters Soft constraint force mixing. Soft error reduction parameter dynamically "stiffness"-equivalent coefficient for contact joints dynamically "damping"-equivalent coefficient for contact joints Similar to ODE's max_vel implementation. See http://bulletphysics.org/mediawiki-1.5.8/index.php/BtContactSolverInfo#Split_Impulse for more information. Similar to ODE's max_vel implementation. See http://bulletphysics.org/mediawiki-1.5.8/index.php/BtContactSolverInfo#Split_Impulse for more information. Soft contact parameters based on the paper: http://www.cc.gatech.edu/graphics/projects/Sumit/homepage/papers/sigasia11/jain_softcontacts_siga11.pdf This is variable k_v in the soft contacts paper. Its unit is N/m. This is variable k_e in the soft contacts paper. Its unit is N/m. Viscous damping of point velocity in body frame. Its unit is N/m/s. Fraction of mass to be distributed among deformable nodes. The visual properties of the link. This element specifies the shape of the object (box, cylinder, etc.) for visualization purposes. Unique name for the visual element within the scope of the parent link. If true the visual will cast shadows. Will be implemented in a future release. The amount of transparency (0 = opaque, 1 = fully transparent). Optional meta information for the visual. The information contained within this element should be used to provide additional feedback to an end user. The layer in which this visual is displayed. The layer number is useful for programs, such as Gazebo, that put visuals in different layers for enhanced visualization. A frame of reference to which a pose is relative. Name of the frame.
This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. The material of the visual element. Name of material from an installed script file. This will override the color element if the script exists. URI of the material script file Name of the script within the script file vertex, pixel, normal_map_objectspace, normal_map_tangentspace filename of the normal map If false, dynamic lighting will be disabled The ambient color of a material specified by set of four numbers representing red/green/blue, each in the range of [0,1]. The diffuse color of a material specified by set of four numbers representing red/green/blue/alpha, each in the range of [0,1]. The specular color of a material specified by set of four numbers representing red/green/blue/alpha, each in the range of [0,1]. The emissive color of a material specified by set of four numbers representing red/green/blue, each in the range of [0,1]. Physically Based Rendering (PBR) material. There are two PBR workflows: metal and specular. While both workflows and their parameters can be specified at the same time, typically only one of them will be used (depending on the underlying renderer capability). It is also recommended to use the same workflow for all materials in the world. PBR using the Metallic/Roughness workflow. Filename of the diffuse/albedo map. Filename of the roughness map. Material roughness in the range of [0,1], where 0 represents a smooth surface and 1 represents a rough surface. This is the inverse of a specular map in a PBR specular workflow. Filename of the metalness map. Material metalness in the range of [0,1], where 0 represents non-metal and 1 represents raw metal Filename of the environment / reflection map, typically in the form of a cubemap Filename of the ambient occlusion map. The map defines the amount of ambient lighting on the surface. Filename of the normal map. The normals can be in the object space or tangent space as specified in the 'type' attribute The space that the normals are in. Values are: 'object' or 'tangent' Filename of the emissive map. PBR using the Specular/Glossiness workflow. Filename of the diffuse/albedo map. Filename of the specular map. Filename of the glossiness map. Material glossiness in the range of [0-1], where 0 represents a rough surface and 1 represents a smooth surface. This is the inverse of a roughness map in a PBR metal workflow. Filename of the ambient occlusion map. The map defines the amount of ambient lighting on the surface. Filename of the normal map. The normals can be in the object space or tangent space as specified in the 'type' attribute The space that the normals are in. Values are: 'object' or 'tangent' Filename of the emissive map. The shape of the visual or collision object. You can use the empty tag to make empty geometries. Box shape The three side lengths of the box. The origin of the box is in its geometric center (inside the center of the box). Cylinder shape Radius of the cylinder Length of the cylinder A heightmap based on a 2d grayscale image. URI to a grayscale image file The size of the heightmap in world units. When loading an image: "size" is used if present, otherwise defaults to 1x1x1. 
When loading a DEM: "size" is used if present, otherwise defaults to true size of DEM. A position offset. The heightmap can contain multiple textures. The order of the texture matters. The first texture will appear at the lowest height, and the last texture at the highest height. Use blend to control the height thresholds and fade between textures. Size of the applied texture in meters. Diffuse texture image filename Normalmap texture image filename The blend tag controls how two adjacent textures are mixed. The number of blend elements should equal one less than the number of textures. Min height of a blend layer Distance over which the blend occurs Set if the rendering engine will use terrain paging Samples per heightmap datum. For rasterized heightmaps, this indicates the number of samples to take per pixel. Using a lower value, e.g. 1, will generally improve the performance of the heightmap but lower the heightmap quality. Extrude a set of boxes from a grayscale image. URI of the grayscale image file Scaling factor applied to the image Grayscale threshold Height of the extruded boxes The amount of error in the model Mesh shape Mesh uri Use a named submesh. The submesh must exist in the mesh specified by the uri Name of the submesh within the parent mesh Set to true to center the vertices of the submesh at 0,0,0. This will effectively remove any transformations on the submesh before the poses from parent links and models are applied. Scaling factor applied to the mesh Plane shape Normal direction for the plane. When a Plane is used as a geometry for a Visual or Collision object, then the normal is specified in the Visual or Collision frame, respectively. Length of each side of the plane. Note that this property is meaningful only for visualizing the Plane, i.e., when the Plane is used as a geometry for a Visual object. The Plane has infinite size when used as a geometry for a Collision object. Defines an extruded polyline shape A series of points that define the path of the polyline. Height of the polyline Sphere shape radius of the sphere A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. The sensor tag describes the type and properties of a sensor. A unique name for the sensor. This name must not match another model in the model. The type name of the sensor. By default, SDFormat supports types air_pressure, altimeter, camera, contact, depth_camera, force_torque, gps, gpu_lidar, gpu_ray, imu, lidar, logical_camera, magnetometer, multicamera, ray, rfid, rfidtag, rgbd_camera, sonar, thermal_camera, wireless_receiver, and wireless_transmitter. The "ray" and "gpu_ray" types are equivalent to "lidar" and "gpu_lidar", respectively. It is preferred to use "lidar" and "gpu_lidar" since "ray" and "gpu_ray" will be deprecated. The "ray" and "gpu_ray" types are maintained for legacy support. If true the sensor will always be updated according to the update rate. The frequency at which the sensor data is generated. If left unspecified, the sensor will generate data every cycle. If true, the sensor is visualized in the GUI Name of the topic on which data is published. This is necessary for visualization A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. 
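For illustration, a minimal sensor declaration combining the common sensor attributes above with a camera block (the camera-specific elements are described in more detail further below) might look like the following sketch; the sensor name, topic, pose, and numeric values are placeholders.
  <sensor name="front_camera" type="camera">
    <always_on>true</always_on>
    <update_rate>30</update_rate>
    <visualize>true</visualize>
    <topic>front_camera/image</topic>
    <pose>0.1 0 0.2 0 0 0</pose>
    <camera>
      <horizontal_fov>1.047</horizontal_fov>
      <image>
        <width>640</width>
        <height>480</height>
        <format>R8G8B8</format>
      </image>
      <clip>
        <near>0.1</near>
        <far>100</far>
      </clip>
    </camera>
  </sensor>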
A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. These elements are specific to an air pressure sensor. The initial altitude in meters. This value can be used by a sensor implementation to augment the altitude of the sensor. For example, if you are using simulation instead of creating a 1000 m mountain model on which to place your sensor, you could instead set this value to 1000 and place your model on a ground plane with a Z height of zero. Noise parameters for the pressure data. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to an altimeter sensor. Noise parameters for vertical position The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical velocity The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). 
"gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to camera sensors. An optional name for the camera. Horizontal field of view The image size in pixels and format. Width in pixels Height in pixels (L8|R8G8B8|B8G8R8|BAYER_RGGB8|BAYER_BGGR8|BAYER_GBRG8|BAYER_GRBG8) The near and far clip planes. Objects closer or farther than these planes are not rendered. Near clipping plane Far clipping plane Enable or disable saving of camera frames. True = saving enabled The path name which will hold the frame data. If path name is relative, then directory is relative to current working directory. Depth camera parameters Type of output The near and far clip planes. Objects closer or farther than these planes are not detected by the depth camera. Near clipping plane for depth camera Far clipping plane for depth camera The properties of the noise model that should be applied to generated images The type of noise. Currently supported types are: "gaussian" (draw additive noise values independently for each pixel from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. Lens distortion to be applied to camera images. See http://en.wikipedia.org/wiki/Distortion_(optics)#Software_correction The radial distortion coefficient k1 The radial distortion coefficient k2 The radial distortion coefficient k3 The tangential distortion coefficient p1 The tangential distortion coefficient p2 The distortion center or principal point Lens projection description Type of the lens mapping. Supported values are gnomonical, stereographic, equidistant, equisolid_angle, orthographic, custom. For gnomonical (perspective) projection, it is recommended to specify a horizontal_fov of less than or equal to 90° If true the image will be scaled to fit horizontal FOV, otherwise it will be shown according to projection type parameters Definition of custom mapping function in a form of r=c1*f*fun(theta/c2 + c3). See https://en.wikipedia.org/wiki/Fisheye_lens#Mapping_function Linear scaling constant Angle scaling constant Angle offset constant Focal length of the optical system. Note: It's not a focal length of the lens in a common sense! 
This value is ignored if 'scale_to_fov' is set to true Possible values are 'sin', 'tan' and 'id' Everything outside of the specified angle will be hidden, 90° by default Resolution of the environment cube map used to draw the world Camera intrinsic parameters for setting a custom perspective projection matrix (cannot be used with WideAngleCamera since this class uses image stitching from 6 different cameras for achieving a wide field of view). The focal lengths can be computed using focal_length_in_pixels = (image_width_in_pixels * 0.5) / tan(field_of_view_in_degrees * 0.5 * PI/180) X focal length (in pixels, overrides horizontal_fov) Y focal length (in pixels, overrides horizontal_fov) X principal point (in pixels) Y principal point (in pixels) XY axis skew A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. These elements are specific to the contact sensor. name of the collision element within a link that acts as the contact sensor. Topic on which contact data is published. These elements are specific to the GPS sensor. Parameters related to GPS position measurement. Noise parameters for horizontal position measurement, in units of meters. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical position measurement, in units of meters. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. 
For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to GPS position measurement. Noise parameters for horizontal velocity measurement, in units of meters/second. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical velocity measurement, in units of meters/second. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the IMU sensor. This string represents special hardcoded use cases that are commonly seen with typical robot IMU's: - CUSTOM: use Euler angle custom_rpy orientation specification. The orientation of the IMU's reference frame is defined by adding the custom_rpy rotation to the parent_frame. - NED: The IMU XYZ aligns with NED, where NED orientation relative to Gazebo world is defined by the SphericalCoordinates class. - ENU: The IMU XYZ aligns with ENU, where ENU orientation relative to Gazebo world is defined by the SphericalCoordinates class. 
- NWU: The IMU XYZ aligns with NWU, where NWU orientation relative to Gazebo world is defined by the SphericalCoordinates class. - GRAV_UP: where direction of gravity maps to IMU reference frame Z-axis with Z-axis pointing in the opposite direction of gravity. IMU reference frame X-axis direction is defined by grav_dir_x. Note if grav_dir_x is parallel to gravity direction, this configuration fails. Otherwise, IMU reference frame X-axis is defined by projection of grav_dir_x onto a plane normal to the gravity vector. IMU reference frame Y-axis is a vector orthogonal to both X and Z axis following the right hand rule. - GRAV_DOWN: where direction of gravity maps to IMU reference frame Z-axis with Z-axis pointing in the direction of gravity. IMU reference frame X-axis direction is defined by grav_dir_x. Note if grav_dir_x is parallel to gravity direction, this configuration fails. Otherwise, IMU reference frame X-axis is defined by projection of grav_dir_x onto a plane normal to the gravity vector. IMU reference frame Y-axis is a vector orthogonal to both X and Z axis following the right hand rule. This field and parent_frame are used when localization is set to CUSTOM. Orientation (fixed axis roll, pitch yaw) transform from parent_frame to this IMU's reference frame. Some common examples are: - IMU reports in its local frame on boot. IMU sensor frame is the reference frame. Example: parent_frame="", custom_rpy="0 0 0" - IMU reports in Gazebo world frame. Example sdf: parent_frame="world", custom_rpy="0 0 0" - IMU reports in NWU frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between North-West-Up and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="0 0 0" - IMU reports in NED frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between North-East-Down and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="M_PI 0 0" - IMU reports in ENU frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between East-North-Up and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="0 0 -0.5*M_PI" - IMU reports in ROS optical frame as described in http://www.ros.org/reps/rep-0103.html#suffix-frames, which is (z-forward, x-left to right when facing +z, y-top to bottom when facing +z). (default gazebo camera is +x:view direction, +y:left, +z:up). Example sdf: parent_frame="local", custom_rpy="-0.5*M_PI 0 -0.5*M_PI" Name of parent frame which the custom_rpy transform is defined relative to. It can be any valid fully scoped Gazebo Link name or the special reserved "world" frame. If left empty, use the sensor's own local frame. Used when localization is set to GRAV_UP or GRAV_DOWN, a projection of this vector into a plane that is orthogonal to the gravity vector defines the direction of the IMU reference frame's X-axis. grav_dir_x is defined in the coordinate frame as defined by the parent_frame element. Name of parent frame in which the grav_dir_x vector is defined. It can be any valid fully scoped Gazebo Link name or the special reserved "world" frame. If left empty, use the sensor's own local frame. Topic on which data is published. DEPRECATED. 
Use the topic element that is a child of the sensor element. These elements are specific to body-frame angular velocity, which is expressed in radians per second Angular velocity about the X axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Angular velocity about the Y axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Angular velocity about the Z axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. 
A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to body-frame linear acceleration, which is expressed in meters per second squared Linear acceleration about the X axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Linear acceleration about the Y axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Linear acceleration about the Z axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. 
For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the lidar sensor. The number of simulated lidar rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle The number of simulated lidar rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle specifies range properties of each simulated lidar The minimum distance for each lidar ray. The maximum distance for each lidar ray. Linear resolution of each lidar ray. The properties of the noise model that should be applied to generated scans The type of noise. Currently supported types are: "gaussian" (draw noise values independently for each beam from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. These elements are specific to logical camera sensors. A logical camera reports objects that fall within a frustum. Computation should be performed on the CPU. Near clipping distance of the view frustum Far clipping distance of the view frustum Aspect ratio of the near and far planes. This is the width divided by the height of the near or far planes. Horizontal field of view of the frustum, in radians. This is the angle between the frustum's vertex and the edges of the near or far plane. These elements are specific to a Magnetometer sensor. Parameters related to the body-frame X axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to the body-frame Y axis of the magnetometer The properties of a sensor noise model. The type of noise. 
Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to the body-frame Z axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the ray (laser) sensor. The number of simulated rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle The number of simulated rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle specifies range properties of each simulated ray The minimum distance for each ray. The maximum distance for each ray. Linear resolution of each ray. The properties of the noise model that should be applied to generated scans The type of noise. Currently supported types are: "gaussian" (draw noise values independently for each beam from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. 
For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. These elements are specific to the sonar sensor. The sonar collision shape. Currently supported geometries are: "cone" and "sphere". Minimum range Max range Radius of the sonar cone at max range. This parameter is only used if geometry is "cone". These elements are specific to a wireless transceiver. Service set identifier (network name) Specifies the frequency of transmission in MHz Only a frequency range is filtered. Here we set the lower bound (MHz). Only a frequency range is filtered. Here we set the upper bound (MHz). Specifies the antenna gain in dBi Specifies the transmission power in dBm Mininum received signal power in dBm These elements are specific to the force torque sensor. Frame in which to report the wrench values. Currently supported frames are: "parent" report the wrench expressed in the orientation of the parent link frame, "child" report the wrench expressed in the orientation of the child link frame, "sensor" report the wrench expressed in the orientation of the joint sensor frame. Note that for each option the point with respect to which the torque component of the wrench is expressed is the joint origin. Direction of the wrench measured by the sensor. The supported options are: "parent_to_child" if the measured wrench is the one applied by parent link on the child link, "child_to_parent" if the measured wrench is the one applied by the child link on the parent link. Name of the projector Texture name Field of view Near clip distance far clip distance A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. An audio sink. An audio source. URI of the audio media. Pitch for the audio media, in Hz Gain for the audio media, in dB. List of collision objects that will trigger audio playback. Name of child collision element that will trigger audio playback. True to make the audio source loop playback. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. Description of a battery. Unique name for the battery. Initial voltage in volts. The light element describes a light source. A unique name for the light. The light type: point, directional, spot. When true, the light will cast shadows. Diffuse light color Specular light color Light attenuation Range of the light The linear attenuation factor: 1 means attenuate evenly over the distance. 
The constant attenuation factor: 1.0 means never attenuate, 0.0 is complete attenuation. The quadratic attenuation factor: adds a curvature to the attenuation. Direction of the light, only applicable for spot and directional lights. Spot light parameters Angle covered by the bright inner cone Angle covered by the outer cone The rate of falloff between the inner and outer cones. 1.0 means a linear falloff, less means slower falloff, higher means faster falloff. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A joint connects two links with kinematic and dynamic properties. By default, the pose of a joint is expressed in the child link frame. A unique name for the joint within the scope of the model. The type of joint, which must be one of the following: (continuous) a hinge joint that rotates on a single axis with a continuous range of motion, (revolute) a hinge joint that rotates on a single axis with a fixed range of motion, (gearbox) geared revolute joints, (revolute2) same as two revolute joints connected in series, (prismatic) a sliding joint that slides along an axis with a limited range specified by upper and lower limits, (ball) a ball and socket joint, (screw) a single degree of freedom joint with coupled sliding and rotational motion, (universal) like a ball joint, but constrains one degree of freedom, (fixed) a joint with zero degrees of freedom that rigidly connects two links. Name of the parent link Name of the child link Parameter for gearbox joints. Given theta_1 and theta_2 defined in description for gearbox_reference_body, theta_2 = -gearbox_ratio * theta_1. Parameter for gearbox joints. Gearbox ratio is enforced over two joint angles. First joint angle (theta_1) is the angle from the gearbox_reference_body to the parent link in the direction of the axis element and the second joint angle (theta_2) is the angle from the gearbox_reference_body to the child link in the direction of the axis2 element. Parameter for screw joints. Parameters related to the axis of rotation for revolute joints, the axis of translation for prismatic joints. Default joint position for this joint axis. Represents the x,y,z components of the axis unit vector. The axis is expressed in the joint frame unless the use_parent_model_frame flag is set to true. The vector should be normalized. Flag to interpret the axis xyz element in the parent model frame instead of joint frame. Provided for Gazebo compatibility (see https://bitbucket.org/osrf/gazebo/issue/494). An element specifying physical properties of the joint. These values are used to specify modeling properties of the joint, particularly useful for simulation. The physical velocity-dependent viscous damping coefficient of the joint. The physical static friction value of the joint. The spring reference position for this joint axis. The spring stiffness for this joint axis. Specifies the limits of this joint. Specifies the lower joint limit (radians for revolute joints, meters for prismatic joints). Omit if joint is continuous. Specifies the upper joint limit (radians for revolute joints, meters for prismatic joints). Omit if joint is continuous.
A value for enforcing the maximum joint effort applied. Limit is not enforced if value is negative. A value for enforcing the maximum joint velocity. Joint stop stiffness. Joint stop dissipation. Parameters related to the second axis of rotation for revolute2 joints and universal joints. Default joint position for this joint axis. Represents the x,y,z components of the axis unit vector. The axis is expressed in the joint frame unless the use_parent_model_frame flag is set to true. The vector should be normalized. Flag to interpret the axis xyz element in the parent model frame instead of joint frame. Provided for Gazebo compatibility (see https://bitbucket.org/osrf/gazebo/issue/494). An element specifying physical properties of the joint. These values are used to specify modeling properties of the joint, particularly useful for simulation. The physical velocity-dependent viscous damping coefficient of the joint. EXPERIMENTAL: if the damping coefficient is negative and implicit_spring_damper is true, adaptive damping is used. The physical static friction value of the joint. The spring reference position for this joint axis. The spring stiffness for this joint axis. An attribute specifying the lower joint limit (radians for revolute joints, meters for prismatic joints). Omit if joint is continuous. An attribute specifying the upper joint limit (radians for revolute joints, meters for prismatic joints). Omit if joint is continuous. An attribute for enforcing the maximum joint effort applied by Joint::SetForce. Limit is not enforced if value is negative. (not implemented) An attribute for enforcing the maximum joint velocity. Joint stop stiffness. Supported physics engines: SimBody. Joint stop dissipation. Supported physics engines: SimBody. Parameters that are specific to a certain physics engine. Simbody specific parameters Force cut in the multibody graph at this joint. ODE specific parameters (DEPRECATION WARNING: In SDFormat 1.5 this tag will be replaced by the same tag directly under the physics-block. For now, this tag overrides the one outside of ode-block, but in SDFormat 1.5 this tag will be removed completely.) If provide feedback is set to true, ODE will compute the constraint forces at this joint. If cfm damping is set to true, ODE will use CFM to simulate damping, which allows for infinite damping, and one additional constraint row (previously used for joint limit) is always active. If implicit_spring_damper is set to true, ODE will use CFM and ERP to simulate stiffness and damping, which allows for infinite damping, and one additional constraint row (previously used for joint limit) is always active. This replaces the cfm_damping parameter in SDFormat 1.4. Scale the excess force in a joint motor at joint limits. Should be between zero and one. Constraint force mixing for constrained directions Error reduction parameter for constrained directions Bounciness of the limits Maximum force or torque used to reach the desired velocity. The desired velocity of the joint. Should only be set if you want the joint to move on load. Constraint force mixing parameter used by the joint stop Error reduction parameter used by the joint stop Suspension constraint force mixing parameter Suspension error reduction parameter If provide feedback is set to true, the physics engine will compute the constraint forces at this joint. For now, provide_feedback under the ode block will override this tag and give a user warning about the migration. provide_feedback under ode is scheduled to be removed in SDFormat 1.5.
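Drawing the joint, axis, dynamics, limit, and ODE parameters above together, a minimal revolute-joint sketch might look like the following. The element names follow the SDFormat conventions implied by the descriptions; the joint and link names and all numeric values are hypothetical:

  <joint name="arm_joint" type="revolute">
    <parent>base_link</parent>
    <child>arm_link</child>
    <axis>
      <xyz>0 0 1</xyz>
      <use_parent_model_frame>false</use_parent_model_frame>
      <dynamics>
        <damping>0.1</damping>
        <friction>0.05</friction>
        <spring_reference>0</spring_reference>
        <spring_stiffness>0</spring_stiffness>
      </dynamics>
      <limit>
        <lower>-1.57</lower>
        <upper>1.57</upper>
        <effort>50</effort>
        <velocity>2.0</velocity>
      </limit>
    </axis>
    <physics>
      <ode>
        <implicit_spring_damper>true</implicit_spring_damper>
        <provide_feedback>false</provide_feedback>
      </ode>
    </physics>
  </joint>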
A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. The sensor tag describes the type and properties of a sensor. A unique name for the sensor. This name must not match another sensor in the model. The type name of the sensor. By default, SDFormat supports types air_pressure, altimeter, camera, contact, depth_camera, force_torque, gps, gpu_lidar, gpu_ray, imu, lidar, logical_camera, magnetometer, multicamera, ray, rfid, rfidtag, rgbd_camera, sonar, thermal_camera, wireless_receiver, and wireless_transmitter. The "ray" and "gpu_ray" types are equivalent to "lidar" and "gpu_lidar", respectively. It is preferred to use "lidar" and "gpu_lidar" since "ray" and "gpu_ray" will be deprecated. The "ray" and "gpu_ray" types are maintained for legacy support. If true, the sensor will always be updated according to the update rate. The frequency at which the sensor data is generated. If left unspecified, the sensor will generate data every cycle. If true, the sensor is visualized in the GUI. Name of the topic on which data is published. This is necessary for visualization. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. These elements are specific to an air pressure sensor. The initial altitude in meters. This value can be used by a sensor implementation to augment the altitude of the sensor. For example, if you are using simulation, then instead of creating a 1000 m mountain model on which to place your sensor, you could set this value to 1000 and place your model on a ground plane with a Z height of zero. Noise parameters for the pressure data. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias.
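As a sketch only, the common sensor attributes and the air-pressure-specific elements described here might be combined as follows. Element names such as air_pressure, reference_altitude, and pressure are assumed from the descriptions, and the sensor name, topic, plugin name, and plugin filename are placeholders:

  <sensor name="barometer" type="air_pressure">
    <always_on>true</always_on>
    <update_rate>30</update_rate>
    <visualize>false</visualize>
    <topic>barometer_data</topic>
    <pose>0 0 0 0 0 0</pose>
    <air_pressure>
      <reference_altitude>1000</reference_altitude>
      <pressure>
        <noise type="gaussian">
          <mean>0.0</mean>
          <stddev>3.0</stddev>
        </noise>
      </pressure>
    </air_pressure>
    <plugin name="pressure_logger" filename="libPressureLogger.so"/>
  </sensor>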
For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to an altimeter sensor. Noise parameters for vertical position The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical velocity The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to camera sensors. An optional name for the camera. Horizontal field of view The image size in pixels and format. Width in pixels Height in pixels (L8|R8G8B8|B8G8R8|BAYER_RGGB8|BAYER_BGGR8|BAYER_GBRG8|BAYER_GRBG8) The near and far clip planes. Objects closer or farther than these planes are not rendered. Near clipping plane Far clipping plane Enable or disable saving of camera frames. True = saving enabled The path name which will hold the frame data. If path name is relative, then directory is relative to current working directory. Depth camera parameters Type of output The near and far clip planes. Objects closer or farther than these planes are not detected by the depth camera. 
Near clipping plane for depth camera Far clipping plane for depth camera The properties of the noise model that should be applied to generated images The type of noise. Currently supported types are: "gaussian" (draw additive noise values independently for each pixel from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. Lens distortion to be applied to camera images. See http://en.wikipedia.org/wiki/Distortion_(optics)#Software_correction The radial distortion coefficient k1 The radial distortion coefficient k2 The radial distortion coefficient k3 The tangential distortion coefficient p1 The tangential distortion coefficient p2 The distortion center or principal point Lens projection description Type of the lens mapping. Supported values are gnomonical, stereographic, equidistant, equisolid_angle, orthographic, custom. For gnomonical (perspective) projection, it is recommended to specify a horizontal_fov of less than or equal to 90° If true the image will be scaled to fit horizontal FOV, otherwise it will be shown according to projection type parameters Definition of custom mapping function in a form of r=c1*f*fun(theta/c2 + c3). See https://en.wikipedia.org/wiki/Fisheye_lens#Mapping_function Linear scaling constant Angle scaling constant Angle offset constant Focal length of the optical system. Note: It's not a focal length of the lens in a common sense! This value is ignored if 'scale_to_fov' is set to true Possible values are 'sin', 'tan' and 'id' Everything outside of the specified angle will be hidden, 90° by default Resolution of the environment cube map used to draw the world Camera intrinsic parameters for setting a custom perspective projection matrix (cannot be used with WideAngleCamera since this class uses image stitching from 6 different cameras for achieving a wide field of view). The focal lengths can be computed using focal_length_in_pixels = (image_width_in_pixels * 0.5) / tan(field_of_view_in_degrees * 0.5 * PI/180) X focal length (in pixels, overrides horizontal_fov) Y focal length (in pixels, overrides horizontal_fov) X principal point (in pixels) Y principal point (in pixels) XY axis skew A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. These elements are specific to the contact sensor. name of the collision element within a link that acts as the contact sensor. Topic on which contact data is published. These elements are specific to the GPS sensor. Parameters related to GPS position measurement. Noise parameters for horizontal position measurement, in units of meters. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. 
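As a worked example of the focal-length formula given above for the camera intrinsics: a 640x480 image with a 60-degree horizontal field of view gives focal_length_in_pixels = (640 * 0.5) / tan(60 * 0.5 * PI/180) = 320 / tan(30 degrees), which is approximately 554.26. Assuming the intrinsic parameters use tag names matching their descriptions (fx, fy, cx, cy, s), the block might be written as:

  <intrinsics>
    <fx>554.26</fx>
    <fy>554.26</fy>
    <cx>320</cx>
    <cy>240</cy>
    <s>0</s>
  </intrinsics>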
For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical position measurement, in units of meters. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to GPS position measurement. Noise parameters for horizontal velocity measurement, in units of meters/second. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Noise parameters for vertical velocity measurement, in units of meters/second. The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). 
"gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the IMU sensor. This string represents special hardcoded use cases that are commonly seen with typical robot IMU's: - CUSTOM: use Euler angle custom_rpy orientation specification. The orientation of the IMU's reference frame is defined by adding the custom_rpy rotation to the parent_frame. - NED: The IMU XYZ aligns with NED, where NED orientation relative to Gazebo world is defined by the SphericalCoordinates class. - ENU: The IMU XYZ aligns with ENU, where ENU orientation relative to Gazebo world is defined by the SphericalCoordinates class. - NWU: The IMU XYZ aligns with NWU, where NWU orientation relative to Gazebo world is defined by the SphericalCoordinates class. - GRAV_UP: where direction of gravity maps to IMU reference frame Z-axis with Z-axis pointing in the opposite direction of gravity. IMU reference frame X-axis direction is defined by grav_dir_x. Note if grav_dir_x is parallel to gravity direction, this configuration fails. Otherwise, IMU reference frame X-axis is defined by projection of grav_dir_x onto a plane normal to the gravity vector. IMU reference frame Y-axis is a vector orthogonal to both X and Z axis following the right hand rule. - GRAV_DOWN: where direction of gravity maps to IMU reference frame Z-axis with Z-axis pointing in the direction of gravity. IMU reference frame X-axis direction is defined by grav_dir_x. Note if grav_dir_x is parallel to gravity direction, this configuration fails. Otherwise, IMU reference frame X-axis is defined by projection of grav_dir_x onto a plane normal to the gravity vector. IMU reference frame Y-axis is a vector orthogonal to both X and Z axis following the right hand rule. This field and parent_frame are used when localization is set to CUSTOM. Orientation (fixed axis roll, pitch yaw) transform from parent_frame to this IMU's reference frame. Some common examples are: - IMU reports in its local frame on boot. IMU sensor frame is the reference frame. Example: parent_frame="", custom_rpy="0 0 0" - IMU reports in Gazebo world frame. Example sdf: parent_frame="world", custom_rpy="0 0 0" - IMU reports in NWU frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between North-West-Up and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="0 0 0" - IMU reports in NED frame. 
Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between North-East-Down and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="M_PI 0 0" - IMU reports in ENU frame. Uses SphericalCoordinates class to determine world frame in relation to magnetic north and gravity; i.e. rotation between East-North-Up and world (+X,+Y,+Z) frame is defined by SphericalCoordinates class. Example sdf given world is NWU: parent_frame="world", custom_rpy="0 0 -0.5*M_PI" - IMU reports in ROS optical frame as described in http://www.ros.org/reps/rep-0103.html#suffix-frames, which is (z-forward, x-left to right when facing +z, y-top to bottom when facing +z). (default gazebo camera is +x:view direction, +y:left, +z:up). Example sdf: parent_frame="local", custom_rpy="-0.5*M_PI 0 -0.5*M_PI" Name of parent frame which the custom_rpy transform is defined relative to. It can be any valid fully scoped Gazebo Link name or the special reserved "world" frame. If left empty, use the sensor's own local frame. Used when localization is set to GRAV_UP or GRAV_DOWN, a projection of this vector into a plane that is orthogonal to the gravity vector defines the direction of the IMU reference frame's X-axis. grav_dir_x is defined in the coordinate frame as defined by the parent_frame element. Name of parent frame in which the grav_dir_x vector is defined. It can be any valid fully scoped Gazebo Link name or the special reserved "world" frame. If left empty, use the sensor's own local frame. Topic on which data is published. DEPRECATED. Use the topic element that is a child of the sensor element. These elements are specific to body-frame angular velocity, which is expressed in radians per second Angular velocity about the X axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Angular velocity about the Y axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. 
For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Angular velocity about the Z axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to body-frame linear acceleration, which is expressed in meters per second squared Linear acceleration about the X axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Linear acceleration about the Y axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. 
rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Linear acceleration about the Z axis The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the lidar sensor. The number of simulated lidar rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle The number of simulated lidar rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle specifies range properties of each simulated lidar The minimum distance for each lidar ray. The maximum distance for each lidar ray. Linear resolution of each lidar ray. The properties of the noise model that should be applied to generated scans The type of noise. Currently supported types are: "gaussian" (draw noise values independently for each beam from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. These elements are specific to logical camera sensors. A logical camera reports objects that fall within a frustum. Computation should be performed on the CPU. 
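Referring back to the lidar elements described just above, a minimal sketch of the scan, range, and noise blocks might read as follows. Tag names such as scan, horizontal, samples, resolution, min_angle, max_angle, range, min, and max are assumed from the descriptions, and all values are illustrative:

  <lidar>
    <scan>
      <horizontal>
        <samples>640</samples>
        <resolution>1</resolution>
        <min_angle>-1.5708</min_angle>
        <max_angle>1.5708</max_angle>
      </horizontal>
    </scan>
    <range>
      <min>0.08</min>
      <max>10.0</max>
      <resolution>0.01</resolution>
    </range>
    <noise type="gaussian">
      <mean>0.0</mean>
      <stddev>0.01</stddev>
    </noise>
  </lidar>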
Near clipping distance of the view frustum Far clipping distance of the view frustum Aspect ratio of the near and far planes. This is the width divided by the height of the near or far planes. Horizontal field of view of the frustum, in radians. This is the angle between the frustum's vertex and the edges of the near or far plane. These elements are specific to a Magnetometer sensor. Parameters related to the body-frame X axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to the body-frame Y axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. Parameters related to the body-frame Z axis of the magnetometer The properties of a sensor noise model. The type of noise. Currently supported types are: "none" (no noise). "gaussian" (draw noise values independently for each measurement from a Gaussian distribution). "gaussian_quantized" ("gaussian" plus quantization of outputs (ie. rounding)) For type "gaussian*", the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the standard deviation of the Gaussian distribution from which noise values are drawn. For type "gaussian*", the mean of the Gaussian distribution from which bias values are drawn. 
For type "gaussian*", the standard deviation of the Gaussian distribution from which bias values are drawn. For type "gaussian*", the standard deviation of the noise used to drive a process to model slow variations in a sensor bias. For type "gaussian*", the correlation time in seconds of the noise used to drive a process to model slow variations in a sensor bias. A typical value, when used, would be on the order of 3600 seconds (1 hour). For type "gaussian_quantized", the precision of output signals. A value of zero implies infinite precision / no quantization. These elements are specific to the ray (laser) sensor. The number of simulated rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle The number of simulated rays to generate per complete laser sweep cycle. This number is multiplied by samples to determine the number of range data points returned. If resolution is less than one, range data is interpolated. If resolution is greater than one, range data is averaged. Must be greater or equal to min_angle specifies range properties of each simulated ray The minimum distance for each ray. The maximum distance for each ray. Linear resolution of each ray. The properties of the noise model that should be applied to generated scans The type of noise. Currently supported types are: "gaussian" (draw noise values independently for each beam from a Gaussian distribution). For type "gaussian," the mean of the Gaussian distribution from which noise values are drawn. For type "gaussian," the standard deviation of the Gaussian distribution from which noise values are drawn. These elements are specific to the sonar sensor. The sonar collision shape. Currently supported geometries are: "cone" and "sphere". Minimum range Max range Radius of the sonar cone at max range. This parameter is only used if geometry is "cone". These elements are specific to a wireless transceiver. Service set identifier (network name) Specifies the frequency of transmission in MHz Only a frequency range is filtered. Here we set the lower bound (MHz). Only a frequency range is filtered. Here we set the upper bound (MHz). Specifies the antenna gain in dBi Specifies the transmission power in dBm Mininum received signal power in dBm These elements are specific to the force torque sensor. Frame in which to report the wrench values. Currently supported frames are: "parent" report the wrench expressed in the orientation of the parent link frame, "child" report the wrench expressed in the orientation of the child link frame, "sensor" report the wrench expressed in the orientation of the joint sensor frame. Note that for each option the point with respect to which the torque component of the wrench is expressed is the joint origin. Direction of the wrench measured by the sensor. The supported options are: "parent_to_child" if the measured wrench is the one applied by parent link on the child link, "child_to_parent" if the measured wrench is the one applied by the child link on the parent link. A plugin is a dynamically loaded chunk of code. It can exist as a child of world, model, and sensor. A unique name for the plugin, scoped to its parent. Name of the shared library to load. If the filename is not a full path name, the file will be searched for in the configuration paths. 
The light element describes a light source. A unique name for the light. The light type: point, directional, spot. When true, the light will cast shadows. Diffuse light color Specular light color Light attenuation Range of the light The linear attenuation factor: 1 means attenuate evenly over the distance. The constant attenuation factor: 1.0 means never attenuate, 0.0 is complete attenuation. The quadratic attenuation factor: adds a curvature to the attenuation. Direction of the light, only applicable for spot and directional lights. Spot light parameters Angle covered by the bright inner cone Angle covered by the outer cone The rate of falloff between the inner and outer cones. 1.0 means a linear falloff, less means slower falloff, higher means faster falloff. A frame of reference to which a pose is relative. Name of the frame. This name must not match another frame defined inside the parent that this frame is attached to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to. A position(x,y,z) and orientation(roll, pitch yaw) with respect to the specified frame. Name of frame which the pose is defined relative to.
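Putting the light properties above together, a spot light might be declared as in the following sketch. Tag names such as cast_shadows, attenuation, inner_angle, outer_angle, and falloff are assumed from the descriptions, and the name, colors, and angles are illustrative:

  <light name="lamp" type="spot">
    <cast_shadows>true</cast_shadows>
    <pose>0 0 3 0 0 0</pose>
    <diffuse>0.8 0.8 0.8 1</diffuse>
    <specular>0.2 0.2 0.2 1</specular>
    <attenuation>
      <range>20</range>
      <constant>0.5</constant>
      <linear>0.1</linear>
      <quadratic>0.01</quadratic>
    </attenuation>
    <direction>0 0 -1</direction>
    <spot>
      <inner_angle>0.3</inner_angle>
      <outer_angle>0.8</outer_angle>
      <falloff>1.0</falloff>
    </spot>
  </light>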