Download the 2014 Federal Plain Language Report Card: Report Card PDF | High-resolution image | White paper

How did grading work in 2014?

Each year, the Center for Plain Language evaluates how effectively federal departments comply with the letter and the spirit of the Plain Writing Act of 2010. In 2014, each department received three grades:

  • Compliance
  • Writing
  • Information design

Compliance

The compliance grade measured whether agencies fulfilled the requirements of the Plain Writing Act of 2010. Agencies earned points for:

  • Having a webpage that describes the agency’s plain language efforts
  • Providing a feedback channel for people to complain about documents that are hard to understand or praise documents that are written clearly
  • Responding to feedback in a timely manner (including our request for documents)
  • Having a link to the plain language webpage on the agency homepage
  • Publishing the agency’s Plain Language Plan
  • Naming the person in charge of the plain language program
  • Publishing an annual report describing what they’ve done
  • Training staff to write in plain language

Writing

Writing scores were based on whether the documents demonstrated effective use of plain language principles, such as the following (a toy automated check for the first question is sketched after the list):

  • Does the writer limit their use of passive and hidden verbs?
  • Does the writer use common words and avoid or define jargon?
  • Is the content direct and concise, or is it wordy?
  • Is the narrative cohesive?
  • Are spelling, grammar, style, and terminology correct?
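The first of these checks lends itself to simple automation. The Python sketch below is a rough illustration only, not the Center’s actual scoring code: it flags likely passive constructions with a regular expression, while real tools rely on part-of-speech tagging and catch cases this heuristic misses.

    import re

    # Naive passive-voice heuristic: a form of "to be" followed by a word
    # that looks like a past participle. Regex-only checks produce false
    # positives (e.g., "is tired"), so treat matches as candidates to review.
    PASSIVE_RE = re.compile(
        r"\b(is|are|was|were|be|been|being)\s+(\w+(?:ed|en))\b",
        re.IGNORECASE,
    )

    def passive_candidates(text):
        """Return phrases that look like passive voice."""
        return [" ".join(m.groups()) for m in PASSIVE_RE.finditer(text)]

    print(passive_candidates("The form was completed and must be returned."))
    # ['was completed', 'be returned']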

Information design

Information design scores were based on whether the document consistently used design to guide readers’ attention and reinforce key messages:

  • Does the designer use variations in typography (such as bolding or increased size) to help readers organize and understand the content?
  • Does the writer use whitespace to effectively separate and highlight content, guiding the reader’s attention to and through the document?
  • Does the designer use color to guide the reader’s attention to important information?
  • Does the designer choose pictures, charts, or graphics that reinforce and extend the written content, or were images included merely to add “visual interest”? If there were no images or graphics, would adding them help the reader understand?

We used automated tools (Microsoft Word and Acrolinx) to evaluate basic writing characteristics. Volunteers scored design elements that software can’t assess.
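To make that concrete, the sketch below computes the standard Flesch Reading Ease score, one example of the basic characteristics a tool like Word can report. The syllable counter is a rough heuristic, and the whole sketch illustrates this category of check; it is not the rubric the Center or Acrolinx used.

    import re

    def count_syllables(word):
        """Rough syllable count: vowel groups, minus a silent trailing 'e'."""
        word = word.lower()
        count = len(re.findall(r"[aeiouy]+", word))
        if word.endswith("e") and count > 1:
            count -= 1
        return max(count, 1)

    def flesch_reading_ease(text):
        """Flesch Reading Ease: roughly 0-100; higher means easier to read."""
        words = re.findall(r"[A-Za-z']+", text)
        if not words:
            return 0.0
        sentences = max(len(re.findall(r"[.!?]+", text)), 1)
        syllables = sum(count_syllables(w) for w in words)
        return (206.835
                - 1.015 * (len(words) / sentences)
                - 84.6 * (syllables / len(words)))

    print(round(flesch_reading_ease(
        "Write short sentences. Use words your readers know."), 1))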

How were agencies graded?

The Center worked with Usability.org to design and implement the Report Card analysis. Each department reported on its plain language program by completing an online survey. Departments were asked to:

  • Report information showing compliance with the specific requirements of the Plain Writing Act (for instance, the URL for their Plain Writing website)
  • Identify documents or webpages that could be analyzed for writing and information design
  • Describe if and how the writing samples were tested
  • Describe challenges and wins within their plain language programs

Center board members presented a Federal PLAIN Network workshop to help departments understand the requirements and prepare their materials.

Two departments, Interior and State, did not report.

Read our white paper about how agencies were graded: Who made the Grade? 2014 Federal Plain Language Report Card.

Honor roll or detention?

Agencies that did well got a commendation letter from Congressman Dave Loebsack. Agencies that did poorly received strong encouragement to do better next year.

Want to learn more?

Appendix: Federal Plain Language Websites & Senior Officials

Get involved

Analysis for the Report Card happens between April and October. To get involved, contact Kath Straub.
