Test and Evaluation Handbook

Foreword

The 1st Rikugun Test and Evaluation Battalion is tasked with acquiring, testing, and evaluating products that might be of interest to the Star Army Rikugun. This guide is a quick primer on its processes.

Test and Eval Loop

At the highest level, the 1st RTE Bn uses the Test and Evaluation Loop to guide its processes. The process is referred to as a loop because, at any stage, a product might be moved back to a previous phase. Additionally, the 1st RTE Bn retains partial responsibility for a solution up to the end of its first five-year re-evaluation cycle.

  1. Discovery Phase
    1. The 1st RTE Bn regularly surveys service members, reads reports on, and directly tests currently in-service Rikugun equipment, doctrine, and vulnerabilities. This sort of analysis allows the battalion to identify potential needs.
  2. Test Phase
    1. A variety of solutions are considered, including doctrinal shifts and technological acquisitions. If a piece of technology is required, the battalion is authorized to acquire it in limited numbers. Evaluators then put the product through its paces, designing experiments to test a variety of hard and soft factors.
  3. Acquisition Phase
    1. Once one or more potential solutions are identified, the battalion consults with the Star Army Research Administration, Star Army Logistics, and the Star Army Doctrine Administration to study acquiring the solution in sufficient numbers.
  4. Implementation Phase
    1. At this point in the T&E Loop, the process is handed over to Logistics and the Doctrine Administration to oversee implementation. Advisors from the 1st RTE Bn are often included in the process to ensure that the procedures they developed and the weaknesses they identified are transmitted to the units using the product.
  5. Evaluation Phase
    1. New equipment is rarely rolled out all at once. Instead, select units are asked to test the new product in field conditions. Often, 1st RTE Bn personnel are attached to these units or, where available, test the products themselves. As the product is rolled out in larger numbers, user experience and product effectiveness are gauged through combat data, surveys, and reports written by trained testers. Once a solution is fielded, it is monitored to ensure that it meets the identified need sufficiently and that it does not create new needs. This process helps shake out bugs.

Test and Eval Process

Considerations

It is easy to make the mistake of simply comparing the on-paper statistics of two systems (weight, firepower, armor, etc.) and choosing the “best.” While this logic might work in video games, it rarely holds true in real life. The Test and Evaluation Process is much more complex and requires both strong analytical skills and creativity. Easily quantifiable hard factors such as fuel consumption and unit price are an important starting place, but they only tell part of the story. To truly understand a product or system, the 1st RTE Bn trains its evaluators to analyze soft factors: less tangible or less easily quantified considerations such as end user experience, human factors, and maintenance and logistical burdens. Hard and soft factors are not completely separate; in fact, certain soft factors contribute to hard factors and vice versa. When weighing hard factors against soft factors, there is no formula or perfect balance that can be achieved.

The following lists of hard and soft factors are non-exhaustive. Evaluators are encouraged to use these factors to expand their analytical imagination.

Hard Factors

Hard factors tend to be easily calculated. Some examples are as follows:

  • Ammunition counts
  • Range
  • Accuracy
  • Armor thickness
  • Speed
  • Weapon caliber
  • Warhead size
  • Engine output
  • Radar signature
  • Cost

Soft Factors

“Soft factors” can be a somewhat nebulous term. A good Test and Evaluation Officer learns methods of operationalizing and quantifying these more nebulous concepts. Some soft factors verge on hard factors but are still not the sort of considerations listed on an average specification sheet. Because soft factors are not always self-explanatory, this section includes more commentary.

  • Ergonomics
    • This is an oft-overlooked category. Gear that works with the user rather than against them can bring out the best in a crew. Examples of good ergonomics might include: cleanly presented information; enough room for each crew member to perform their function without bunching up, waiting for somebody else to move, or ducking; intuitive controls; economy of movement; and measures to reduce crew stress (proper ventilation, soundproofing, insulation, eye strain reduction, smoother rides, responsive controls).
  • Costs associated with not having the equipment
    • For example, the cost of training new soldiers and paying death benefits versus the cost of issuing IFAKs and training soldiers in combat care.
  • Maintenance requirements
    • Do we need maintenance between every mission or after every few missions?
    • A rifle that never misses but can only shoot in perfect conditions makes a worse infantry rifle than one that is not quite as accurate but can be used reliably in the face of adverse battlefield conditions.1)
  • Urgency
    • Sometimes an OK fix today is better than a perfect fix tomorrow
  • Training requirements
  • Production capability (can the supplier deliver the product in the numbers we need?)
  • Situational Awareness
  • Ease of entry and exit
    • Especially important for things like armored vehicles where crews might need to disembark under fire or escape a damaged vehicle.

Principles

There isn’t a weapon system that can’t malfunction, nor is there impenetrable armor. A failure during testing doesn’t always constitute a mark against a product; rather, it helps outline the product's uses and capabilities.

The Lesson of the Liger tank

The right system for us, not the best system, period. How is this system intended to be used? Under ideal conditions? Under less-than-ideal conditions? What is the skill level of the average operator? How many of this system do we need? Will the maintenance requirements for this system require us to shift our doctrine in ways we do not intend?

Understanding the Users

No piece of equipment exists alone on the battlefield. How does it fit into the larger picture? A fighter could be the most maneuverable craft on the market, able to out-dogfight anything else, but what good is that if it can’t talk to other craft or reliably find and identify targets? The most protective power armor bristling with aether cannons is completely useless if spare parts can’t be sourced or if maintenance requirements keep it grounded more often than not.

Simply put, while end user experience is important, that user exists as part of a network rather than at the end of a chain. A vehicle operator might be the end user for an armored vehicle, but their positive experience with the system has to be weighed against the logistical and maintenance burden as well as the cost of operation. Additionally, a decision maker must consider how well the armored vehicle operates alongside infantry. Oftentimes, a system is beloved by one part of this web but hated by the others. Decision makers must weigh the pros and cons and determine whether the benefits outweigh the drawbacks.

Good kit with bad doctrine is bad kit.

Methods

OOC Notes

Locked 0ut created this article on 2022/11/14 07:08.

🚧 This article is a work-in-progress. It is not currently approved.

1)
Mud, dust, water, impacts, and general mistreatment
