Assessment tests are often required, but results depend on the writing style and preferences of the test writer. Or on the guide they use. CMOS? AP? MLA? And will the tester tell you which guide was used to develop the answers? Or is it, again, the writer’s own preferences?
I stand on my published writing before any assessment test. Do you like my writing or not? That’s fair, isn’t it? No problem if you don’t. Besides my published writing, and the last five years I’ve spent professionally writing and editing, I have 851 posts at my personal blog, and I have a 60,000-word MS on this page. See anything you like? Yet I recently had to take an assessment test.
This was a timed test on grammar problems. Eleven minutes. This lack of time kept me from looking up solutions to these difficult, unusual thought experiments.
Editors like me rarely spend time researching usage. We just recast the sentence, which is far quicker than musing over conflicting opinions and guides. In this test, however, we were supposed to dawdle over things like whether “womens” or “women’s” was correct in a particular sentence. Good grief. Just revise. And get on with the rest of your work.
Their first question probably revolved around a semicolon. I don’t use them in business writing since they slow things down and make things less direct, although I admire how Melville and Tolstoy used them to string together 150-word sentences with six tangents. Want to hear something shocking?
I sometimes substitute a comma when a semicolon is called for.
That’s when one of our writers pens an otherwise well-crafted or intriguing sentence that only needs to move faster. Jazz would never have developed if musicians always stuck to playing the correct notes. More eccentricities? I don’t use dashes, parentheses, or italics. I let our writers use them, but only to a certain extent. But back to the test.
The test writer consistently used “of been” instead of “have been.” That’s a grammar mistake, not a style choice.
Question seven moved me on before I was done. A technical problem.
Question eight referred to Catalan. Odd. I thought the region was Catalonia, as in Orwell’s Homage to Catalonia. The test writer then called the people catalans, not capitalizing the “c.” That’s like calling a Californian a californian. Yet there was no way to point out this mistake. Instead, the grammar problem in this question was about something else. I’m being graded by a writer who can’t capitalize?
Question ten underlined passages in the text, labeled “A” through “K,” and instructed: “Select which ones have a problem.” “K” had a problem, but there was no “K” radio button to click. The button list only ran through “J.” Another technical problem.
Five professional editors would grade this test differently. Do CMOS, MLA, and AP agree on everything? Of course not. And if you’re working for a publication that has its own style sheet, well, it may not agree with any of these guides.
For comparison, the writing test I took five years ago for InFocus required twelve hours to complete. They paid me for my time and I wrote a number of papers on subjects they chose. An extended essay test. I’m still working for this honest and professional company.
An eleven-minute assessment test is better suited to math or other fields whose problems have definite answers. In writing, there are grammar mistakes that all can agree on, but there are also thousands of instances in which writers and editors will disagree. There’s an art to English that an assessment test cannot assess.