
[blackshark] Aerial AI


amalahama

Recommended Posts

In case anyone missed it, there is a new article, this time about the modeling of the aerial AI in BlackShark and the way it will now detect targets:

 

DCS: Black Shark Target Detection Model for AI Aircraft

 

DCS: Black Shark features new target assignment functionality in the mission editor and a complex target detection model for AI fixed and rotary wing aircraft. The model accounts for specific onboard sensors and environmental conditions. It is primarily designed to provide more realistic AI air-to-ground operations and includes the following variables:

 

* Unit onboard sensors
  - air-to-air radar
  - air-to-surface radar
  - multimode radar
  - radar warning receiver
  - television optics
  - low-light television optics
  - imaging infrared optics
  - infrared search and track system
* Unit skill
* Unit speed
* Unit cockpit view limits
* Time of day
* Target background (open terrain, forest, water, road, etc.)
* Atmospheric conditions
  - fog
  - overcast
  - precipitation
* Line of sight obstruction (terrain, structures)
* Target size (including dust tail when moving over unpaved surfaces)
* Single vs. group targets
* Clutter (buildings/structures)
* Weapon firing
* Artificial illumination (flare, ground fire, etc.)
* Target light source (lights, beacons, etc.)
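As a rough illustration of how variables like these could feed a single detection outcome, here is a hypothetical sketch in which each factor scales a sensor's nominal range multiplicatively. The function, factor names, and numbers are all assumptions for illustration, not the actual DCS: Black Shark model.

```python
# Hypothetical sketch: multiplicative combination of detection factors.
# All factor values are illustrative assumptions, not actual DCS data.
def detection_range(base_range_km, skill=1.0, light=1.0, weather=1.0,
                    target_size=1.0, los_clear=True):
    """Scale a sensor's nominal range by environment and target factors."""
    if not los_clear:           # terrain or structures block line of sight
        return 0.0
    # Multiplicative factors: any factor of 0 eliminates detection entirely.
    return base_range_km * skill * light * weather * target_size

# Example: a skilled pilot in clear daylight vs. a rookie at dusk in fog.
good = detection_range(10.0, skill=1.0, light=1.0, weather=1.0)
bad = detection_range(10.0, skill=0.6, light=0.4, weather=0.3)
```

A multiplicative form matches the article's behavior that any single blocking condition (no line of sight, bad enough weather) can make detection impossible regardless of the other factors.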

 

When assigning an attack waypoint, the mission designer designates a specific user-defined area on the map and the target type(s) for the attacking aircraft to engage (vehicles, buildings, aircraft, etc.). Upon reaching the attack waypoint, the AI searches for the desired target type(s) in the designated area until it either finds a target or reaches the next waypoint. The designated area can be minimized to strike a single coordinate (‘point’) on the map. The mission designer can assign the AI to conduct level (“carpet”) bombing by placing two map points under the same name and identifying the initial and final target ‘points’.
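The attack-waypoint task described above could be represented roughly as follows. The class and field names are assumptions for illustration; they do not reflect the actual mission editor data format.

```python
# Hypothetical representation of an attack-waypoint task with a designated
# search area and target type filter. Field names are assumptions.
from dataclasses import dataclass, field

@dataclass
class AttackTask:
    center: tuple                 # (x, y) map coordinate of the search area
    radius_m: float               # 0.0 collapses the area to a single 'point'
    target_types: list = field(default_factory=lambda: ["vehicles"])

    def is_point_strike(self):
        return self.radius_m == 0.0

area = AttackTask(center=(1000.0, 2000.0), radius_m=500.0,
                  target_types=["vehicles", "buildings"])
point = AttackTask(center=(1000.0, 2000.0), radius_m=0.0)
```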

 

The various radars and optical sights used by AI aircraft are defined individually for each unit according to its characteristics. This allows modeling a wide range of combat capabilities of the simulated aircraft, depending on time of day and weather conditions.

 

Aircraft with no optical sensors or radar, such as the Su-25, will detect targets only when visual contact is possible. Detection depends on the horizontal and vertical cockpit view limits and on various environmental conditions, including time of day, weather, line of sight and other variables. At night, visual detection can be aided with artificial illumination or target firing activity. In bad weather, visual detection may be impossible.
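The cockpit view limits mentioned above can be pictured as a simple angular test: a target is a candidate for visual detection only if its bearing falls inside the pilot's horizontal and vertical scan limits. The limit values below are invented placeholders, not Su-25 data.

```python
# Illustrative check of whether a target bearing falls inside the pilot's
# cockpit view limits. The limit angles are assumed, not real aircraft data.
AZ_LIMIT_DEG = 140.0                   # horizontal half-angle (assumed)
EL_MIN_DEG, EL_MAX_DEG = -30.0, 60.0   # vertical limits (assumed)

def in_cockpit_view(az_deg, el_deg):
    """True if the target bearing is within the assumed view limits."""
    return abs(az_deg) <= AZ_LIMIT_DEG and EL_MIN_DEG <= el_deg <= EL_MAX_DEG
```

In a fuller model this test would gate the environmental checks: a target directly behind or far below the aircraft is never detected visually, whatever the lighting or weather.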

 

Aircraft with television or low-light television optics will rely on natural or artificial illumination to detect targets. Reduction of natural light, fog, cloud cover, and heavy precipitation will reduce and ultimately eliminate the effectiveness of such systems.

 

Imaging infrared devices will allow aircraft to detect targets at night and in bad weather.

 

Aircraft equipped with radar are able to detect air and ground targets at any time of day and in any weather conditions. For air-to-surface radars, targets masked in ground clutter, such as city blocks, may be difficult or impossible to detect. However, unit movement, large unit groups, and unit location on roads or runways will enhance radar detection. Certain radar models are restricted to detecting only static structures or maritime targets (bombers and naval reconnaissance aircraft, respectively).
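The clutter penalty and the movement/group/road bonuses could be sketched as multipliers on a base detection chance. All multiplier values here are invented for illustration.

```python
# Sketch of clutter penalties and movement/group/road bonuses for an
# air-to-surface radar. All multipliers are invented for illustration.
def radar_detect_chance(base=0.5, in_city=False, moving=False,
                        group_size=1, on_road=False):
    p = base
    if in_city:
        p *= 0.2        # ground clutter masks the target
    if moving:
        p *= 1.5        # a moving return stands out against clutter
    if group_size > 1:
        p *= 1.2        # large groups are easier to pick out
    if on_road:
        p *= 1.3        # road location aids recognition
    return min(p, 1.0)
```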

 

The unit skill and airspeed also affect the likelihood, range and time required for detection. In general, higher skill settings, lower airspeed and greater cockpit visibility increase the chances and range of both visual and sensor-based detection.
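One plausible way to model the skill and airspeed effects is to scale the time needed to detect a target: a higher skill shortens it, a higher airspeed lengthens it. The formula and constants below are assumptions for illustration only.

```python
# Sketch of how skill and airspeed might scale detection time.
# Formula and constants are assumptions, not actual DCS values.
def time_to_detect(base_s, skill, speed_kmh, ref_speed_kmh=300.0):
    """Higher skill shortens detection time; higher speed lengthens it."""
    skill_factor = {"rookie": 1.5, "average": 1.0, "ace": 0.6}[skill]
    speed_factor = speed_kmh / ref_speed_kmh
    return base_s * skill_factor * speed_factor
```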

 

Numerous target characteristics are also calculated in the model. The target’s general size determines the maximum detection range for visual and sensor contact. A target in a group of units is likely to be detected at a longer range than a target that is isolated. A moving target’s dust tail, including that of low-flying aircraft, increases detection range (the dust tail is not present during heavy precipitation). Finally, any weapon firing speeds detection. In general, more visible weapon types, such as MLRS barrages, will be detected at longer range than less visible types, such as machine gun fire. Weapon fire can be detected either when the firing source is in the AI field of view or when the weapon trajectory enters the AI field of view. For example, the AI can detect and react to a SAM launch from the opposite side of a mountain if the missile enters the AI’s field of view in flight.
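The two weapon-fire detection paths (visible firing source, or trajectory entering the field of view) amount to a simple disjunction. A minimal sketch, where the field-of-view test is a placeholder predicate supplied by the caller:

```python
# Sketch of the two ways weapon fire can reveal a unit: the firing source
# is in the AI field of view, or the weapon's trajectory enters it.
def weapon_fire_detected(source_visible, trajectory_points, in_fov):
    """in_fov: predicate returning True if a point is inside the AI FOV."""
    if source_visible:
        return True
    return any(in_fov(p) for p in trajectory_points)

# Example: a SAM launched behind a mountain is detected once its missile
# climbs into view. The 1000 m visibility threshold is an assumption.
behind_mountain = lambda p: p[1] > 1000.0
traj = [(0, 200), (500, 800), (1000, 1500)]
```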

 

All infantry units, including MANPAD troops, can be detected only after firing their weapons.

 

When engaging with cannon, TV, IR or SAL-guided weapons, the AI will commit to an attack only when visual or optical detection is possible. Target contact is required for weapon release. If target contact is lost before weapon release, the attack can continue only with unguided munitions. For example, if a target is detected due to weapon firing or under artificial illumination, but the illumination or fire ceases before the attack is complete, the AI can continue the engagement of a single target for up to 10 seconds with unguided rockets or bombs, aiming for the last visible target location.
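The release rule above can be summarized in a few lines. The 10-second grace period is taken from the text; the function structure is an assumption.

```python
# Sketch of the described release rule: guided weapons need live target
# contact; unguided munitions get a 10-second grace period after contact
# is lost, aimed at the last known position. Structure is an assumption.
UNGUIDED_GRACE_S = 10.0

def may_release(weapon_guided, contact_now, seconds_since_contact):
    if contact_now:
        return True                  # live contact: any weapon may release
    if weapon_guided:
        return False                 # guided weapons require live contact
    return seconds_since_contact <= UNGUIDED_GRACE_S
```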

 

Anti-ship cruise missiles can only be employed against targets detected with attack sensors, while surface attack cruise missiles can be employed either against detected targets or any map coordinate (‘point’).

 

When set to attack a ‘point’, as opposed to an object, any weapon can be used in any time of day and weather conditions.

 

When engaging air-to-air, sensor or visual contact must be possible for the AI to commit to an attack with cannon and IR-guided missiles. Target contact is required for weapon release. Radar contact is required for use of SARH and ARH missiles.
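The air-to-air requirements reduce to a lookup from weapon type to acceptable contact types. The mapping follows the text; the representation is an assumption.

```python
# Sketch of the air-to-air release requirements: cannon and IR missiles
# accept visual or sensor contact, SARH/ARH missiles require radar contact.
# The dictionary representation is an assumption for illustration.
REQUIRES = {
    "cannon":       {"visual", "sensor"},   # either contact type suffices
    "ir_missile":   {"visual", "sensor"},
    "sarh_missile": {"radar"},
    "arh_missile":  {"radar"},
}

def can_release(weapon, contacts):
    """contacts: set of contact types currently held on the target."""
    return bool(REQUIRES[weapon] & set(contacts))
```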

 

The DCS player can adjust the maximum detection ranges of the various units and individual sensor parameters in the program configuration files outside the simulation.

 

Regards!!!

Link to comment
Share on other sites

It's good that they are improving the AI behavior, because in LOMAC you can't trust them much and they are capable of the biggest blunders you can imagine. In air-to-air they are more or less okay, but when it comes to ground units and air-to-ground you can expect anything.

