The development of artificial intelligence applications in the legal field is not all about robot lawyers and applications that replicate legal roles.
At the forefront of AI developments in modern law practices sits speech recognition (SR) software, a field that has become more crowded since Google and Microsoft entered the fray with their own dictation and SR apps.
There is little doubt that speech recognition ‘smarts’ have taken major leaps. Google’s entry, together with Microsoft’s, has sharpened the market, while the long-standing major player, Nuance’s Dragon, has made major advances in its technology, achieving near-perfect recognition of voice and legal terminology, particularly with the introduction into New Zealand of its latest Legal edition.
Dragon is the only paid-for service of the four and is also the most popular, with over 22 million registered users worldwide.
As a paid programme, owner Nuance has invested heavily in ensuring there is a major value payoff beyond accuracy and ease of use. It also provides major added features, including the ability to control the computer, transcribe audio files, and develop and manage a custom vocabulary (particularly helpful in a profession like the law, which abounds with technical terms), all within a NZ context.
Dragon has long been one of the legal profession’s favourite SR programmes, but the new competitors create an environment that demands a comprehensive test. LawFuel wanted a fair, impartial comparison with the new entrants, and an objective review established a clear speech recognition winner.
So What is Best?
Deciding which SR system is best can be a vexed issue. There are some key factors required to make any test worthwhile, including these:
Accuracy – There is no point in an SR application that is inaccurate and requires time-intensive correction. The most effective way to compare the different applications is to run the same test across each of them, and we located a recent test in which the four major programmes were tested on exactly the same 300 words, using the same microphone.
Testing was scored on misspellings, missed words, correctly recognised words and punctuation errors.
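To see what that kind of scoring involves, here is a minimal sketch in Python. It is an illustration only, assuming a naive word-matching approach; the reviewers’ exact scoring method was not published, and the function name is hypothetical.

```python
# Illustrative scoring sketch only; the reviewers' exact method was not
# published. A real test would use a proper edit-distance alignment.

def score_transcript(reference: str, transcript: str) -> dict:
    """Count reference words missing from the transcript and compute
    a rough word-accuracy percentage."""
    ref_words = reference.lower().split()
    hyp_words = transcript.lower().split()

    # Tally transcript words, then consume one tally per matched
    # reference word; anything left unmatched counts as missed.
    counts = {}
    for w in hyp_words:
        counts[w] = counts.get(w, 0) + 1

    missed = 0
    for w in ref_words:
        if counts.get(w, 0) > 0:
            counts[w] -= 1
        else:
            missed += 1

    accuracy = 100.0 * (len(ref_words) - missed) / len(ref_words)
    return {"total": len(ref_words), "missed": missed,
            "accuracy_pct": round(accuracy, 1)}

# A 300-word test that misses two words scores roughly 99.3%.
print(score_transcript("the quick brown fox", "the quick brown box"))
# {'total': 4, 'missed': 1, 'accuracy_pct': 75.0}
```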
Ease of Use – Another key factor. A tool that can be readily picked up and deployed is a major advantage in any programme. As SR programmes are (supposed to be) hands-free, being able to work each one and navigate its features by voice alone, reducing or eliminating use of the mouse and keyboard, is an important determinant of how good a programme is.
Who Came Out on Top?
Dragon Legal NZ won by a considerable margin, being the most accurate, the most feature-filled and the ‘hands down’ winner for the legal market.
The review was conducted by BusinessNewsDaily in August 2018, pitting the latest entrants to the SR market against one another in a road test.
The verdict:
- Windows Speech Recognition
The transcription was described as “fairly accurate”, and the programme is easy to learn, with a tutorial much like Dragon’s that acts as a voice calibrator, fine-tuning the software to your voice.
Being a Microsoft product, the programme is used mainly for navigating Windows, but it can dictate speech to text in pretty well any application that accepts text input.
There were some issues: “We experienced intermittent trouble activating the program, which is initiated by saying, ‘Start listening.’ To stop dictation, you say, ‘Stop listening.’”
There are two dictation modes: the first one types text directly into text fields; the second mode consists of a dictation scratchpad that allows you to edit and approve dictated text before it’s inserted into the text field.
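To make that kind of listen-and-transcribe loop concrete, here is a brief sketch using the open-source SpeechRecognition package for Python. This is an assumption for illustration only; it is unrelated to the Microsoft and Nuance engines reviewed here.

```python
# Minimal dictation-loop sketch using the open-source SpeechRecognition
# package (pip install SpeechRecognition pyaudio). Illustrative only;
# not the engine used by any of the reviewed products.
import speech_recognition as sr

recognizer = sr.Recognizer()

with sr.Microphone() as source:
    recognizer.adjust_for_ambient_noise(source)  # rough one-off calibration
    print("Listening... press Ctrl+C to stop.")
    while True:
        audio = recognizer.listen(source)  # wait for a spoken phrase
        try:
            # Transcribe via Google's free web speech API.
            print(recognizer.recognize_google(audio))
        except sr.UnknownValueError:
            print("[unrecognised speech]")
```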
How was the accuracy? “We found the dictation’s accuracy lacking at first, but over subsequent tests, it improved. Out of our 300-word paragraph, Windows Speech Recognition missed an average of 4.6 words and punctuation was mostly accurate, with a few missed commas and periods.”
- Microsoft Dictate
The Microsoft application, recently released as an add-on to MS Office, was the worst of the four tested programmes. MS Dictate is a free transcription app like Google’s, although it lacks the latter’s accuracy (which itself is less than ideal). It also suffers from difficulties launching, with some commands going unrecognised or being mistaken for part of the transcription.
Dictate is currently only compatible with Word, Outlook and PowerPoint and uses Cortana’s speech recognition software to transcribe, which requires your computer to be online in order to access the Cortana network.
Like the Google app (below), it has no voice calibration or usage tutorial, although it is simple to install.
“Out of 300 words, it missed an average of 14.6 words. The software misunderstood several words. Plus it was the only program we tested that typed out nonsensical words and sentences. There is a manual punctuation feature that you can toggle, but even with this off, punctuation was often missed or even spelled out in the copy such as ‘period’ and ‘comma.’ Overall, we found Dictate’s transcription ability unsatisfactory and would not recommend it in its current state.”
- Google Docs
Google’s product is more of a “bare bones” dictation tool. Its voice typing feature can be activated through the toolbar or by using Ctrl+Shift+S.
The tool doesn’t require any setup or calibration, which shows in the accuracy test results. Because it is simple and basic, it requires little instruction to use; the instruction that does exist is equally basic, offering little real help to those who need it. As a result, its accuracy is less than ideal, and the tool is not particularly suited to legal use, where accuracy and speed are important.
“With an average of 6.3 missed words in our testing (out of a total of 300 words) punctuation was mostly accurate, but some punctuation isn’t recognized by the program. For instance, in one sentence, rather than inserting a semicolon, it transcribed the word “semicolon.” It featured other quirks that made our copy look sloppy, such as capitalizing certain words for no apparent reason and inserting unnecessary spaces. There’s no way to calibrate or tune the program to your voice to improve accuracy.”
“Overall, we found the simplicity to be a double-edged blade with the quick activation convenient, but no other options available to customize the experience. The ease of use could be improved with the addition of guided instruction.”
- Dragon Legal NZ
Dragon came out as the most accurate of the programmes, as mentioned, missing just two words out of the 300 during the test. The only other errors were some minor missing punctuation, but it was otherwise a ‘spot on’ result.
Part of the reason for the accuracy, surely a key factor for any lawyer to consider, is the programme’s extensive ‘accuracy tuner’ which runs when the software is first installed and permits the system to become fully acquainted with the user’s voice. It can also be run any time a performance enhancement is sought.
It is also simple to use, extremely well supported by tutorials and the supplier, and carries a raft of additional features.
Of course it is also the most expensive, but in the context of what this powerful new software achieves, the cost is minor.
For instance, the New Zealand Legal edition is set up to handle a raft of New Zealand-specific statutes, regulations, place names, Government entities and even Te Reo. The ability to integrate so powerfully with any law practice makes it something of a no-brainer option.
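Pulling the reported figures together, a quick calculation shows how wide Dragon’s margin is. The sketch below simply converts each programme’s average missed-word count (out of the 300 test words) into a rough word-accuracy percentage; punctuation errors are ignored, so it slightly flatters every programme.

```python
# Convert the review's reported missed-word averages (out of 300 words)
# into rough word-accuracy percentages. Punctuation errors are ignored.
results = {
    "Windows Speech Recognition": 4.6,
    "Microsoft Dictate": 14.6,
    "Google Docs voice typing": 6.3,
    "Dragon Legal NZ": 2.0,
}

TOTAL_WORDS = 300
for name, missed in results.items():
    accuracy = 100.0 * (TOTAL_WORDS - missed) / TOTAL_WORDS
    print(f"{name}: {accuracy:.1f}% word accuracy")

# Windows Speech Recognition: 98.5% word accuracy
# Microsoft Dictate: 95.1% word accuracy
# Google Docs voice typing: 97.9% word accuracy
# Dragon Legal NZ: 99.3% word accuracy
```

On those figures, Dragon’s missed-word rate is less than half Windows Speech Recognition’s, under a third of Google’s, and about a seventh of Dictate’s.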
For Further Information Refer Here: