Inking Our Way Back to Nature: Q&A with MyScript

With implications not only for wearables, smartphones, and tablets but also for interactive displays in the enterprise, classroom, kiosks and more, breakthroughs on the digital ink frontier bear watching.

Editor’s Note: “Visionary things are now within our grasp,” enthuses Gary Baum of handwriting recognition and digital ink management company MyScript. Baum, the company’s vice president of marketing, spoke with EECatalog following the company’s June launch of its enhanced developer portal. Edited excerpts of the interview follow.

Courtesy MyScript

EECatalog: What are some examples of how one of the technologies developed by MyScript Labs, Interactive Ink, can make a difference for users of digital displays and kiosks, or for instructors and others using whiteboards?

Gary Baum, Vice President of Marketing, MyScript

Gary Baum, MyScript: Imagine an interactive whiteboard where multiple people in different locations are writing and where what is being written is recognized and edited simultaneously. Imagine using a large-format stylus to handwrite on something larger than a 55-inch screen, and, whether you happen to be writing in cursive, doing a math equation, or drawing shapes, connectors, diagrams, or what have you, that information is captured and converted to a beautified digital format in a way that allows you to share it with a classroom or co-workers in Word, PowerPoint, PDF, Google Docs, and other document formats.

We are starting to see a pretty big upswing in the viability of digital ink, and this applies to everything—from wearables to smartphones to tablets to 2-in-1s. We are seeing it across the board in all computing areas. Even automotive is now using handwriting as a low-distraction input method for the user while the vehicle is on the move. Digital ink interactivity is the next large leap forward.

EECatalog: What other trends are you seeing?

Baum, MyScript: Multi-modal input with the device is the future, with its value, in large part, being the consistency offered to the end user. Users are less and less willing to have a learning curve to use an input method or an application. They just want it to work, and the device must cater to the user. So, in the case of multi-modal input, whether you use voice, or handwriting, or keyboarding, the linguistic and recognition engines working underneath are the same. System development, integration costs and memory sizes are optimized. And the user gets very consistent feedback—not one feedback set for handwriting, another for voice, and still another for keyboarding. User satisfaction is therefore greatly improved.

Another trend is that the quest for mobility continues. Over seventy-two percent of workers in the United States work in a mobile environment, according to IDC [1]—they have to be mobile. Increasing productivity by improving communications and transforming business workflows is rapidly occurring. New, natural input methods such as voice and handwriting are becoming common.

The use of tablets has been increasing in the workplace. Take the case of someone who is in the field doing repairs. Rather than writing on paper, going to an office, having someone transcribe the information or fax it and then transcribe it, and then putting it in a database, with digital ink technology they can write directly on the form, have it recognized, say, “yes, that is what I wanted to write,” send the document, and it’s done. There are fewer errors and tremendous productivity gains in the workforce using this type of technology. And as we move to ever-lighter, smaller devices, carrying a keyboard becomes cumbersome, while virtual keyboards obscure a large portion of the display. We are doing multi-modal, so the user can decide what the best interface is for what they want to do at that moment.

Taking the example just noted a bit further, if that repair person just wants to fill in a form, only input technology is needed. If, however, he or she wants to insert a word in a sentence, with Interactive Ink we can easily do that. With any other type of ink management, where you are simply displaying the ink strokes, you cannot. You would have to go back and maybe lasso some of the ink, move it aside, write something else, bring the lasso back, and re-paste it. The user would have to manage the ink appearance. Interactive Ink frees users from doing that and allows them to focus on the task at hand. It brings a lot of productivity to portable devices—tablets, smartphones, 2-in-1s, and [to] the evolution of 2-in-1s as they become more mobile and are adopted at even greater levels into the workforce. A mobile worker today can take notes using digital ink and convert the notes to a digital document for emailing or sharing with a simple tap of the screen.

EECatalog: What else should our readers know about Interactive Ink and how its development came about?

Baum, MyScript: It is totally new technology. It does not exist anyplace else. The basis for it is understanding what is being written. Handwriting recognition is the essential ingredient. You must have that, or you cannot build on top of it. With the technology to understand the ink, when the user writes in cursive, for example, the words are translated into the digital world in real time as they are written, with a one-to-one correspondence between the ink stroke or strokes that represent each character and their digital meaning. Therefore, when the user wants to cross something out, break a word up, or add something at the end, the [same] techniques used with word processing work: carriage returns, backspacing, line breaks. This is accomplished with three simple pen gestures, one of which is the intuitive scratch-out. A shift key isn’t needed, because if the user wants to write a capital, they simply write a capital.
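
The one-to-one stroke-to-character correspondence Baum describes can be sketched in a few lines. This is an illustrative model only, not the MyScript API: the `RecognizedChar` type and `scratch_out` function are hypothetical names, and a real engine would track strokes at a much finer granularity.

```python
# Hypothetical sketch (not the MyScript API): recognized text keeps a
# one-to-one link between each character and the ink stroke IDs that
# drew it, so a scratch-out gesture can delete characters by stroke.
from dataclasses import dataclass


@dataclass
class RecognizedChar:
    char: str            # the character's digital meaning
    stroke_ids: tuple    # IDs of the ink strokes that produced it


def scratch_out(chars, struck_stroke_ids):
    """Remove every character whose strokes were crossed out."""
    struck = set(struck_stroke_ids)
    return [c for c in chars if not struck & set(c.stroke_ids)]


# "cat" written as three characters, one stroke each (stroke IDs 1-3)
word = [RecognizedChar("c", (1,)), RecognizedChar("a", (2,)),
        RecognizedChar("t", (3,))]

# The user scratches out the stroke that drew "a"
remaining = scratch_out(word, [2])
print("".join(c.char for c in remaining))  # -> ct
```

Because the mapping runs from ink to meaning and back, gestures on the ink (here, crossing out stroke 2) translate directly into edits on the text.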

If I want to write a list, it understands lists and bulleted lists; it understands diagrams. So, whatever I write, the system knows what is being written, and it allows that content to be moved, edited, and modified, all with the stylus and touch. It takes the ink and makes it vastly more powerful. Digital ink is now part of the daily workflow.

The technology stems from MyScript Labs, where individuals have been working in AI and neural networks for 19 years. We were one of the pioneers in bringing AI to bear on the challenge of handwriting recognition. Millions of handwriting samples are used to train the engines, which encompass the languages of more than 90 percent of the world’s population.

AI is well suited to handwriting recognition because it can mimic how a person interprets the meaning of what is being written. It is very complex, but we have brought it to the level where application developers can now access that technology through our developer portal. We try to make it as easy as possible for them to add digital ink interactivity to their applications, and we continue to enhance and evolve the tools to better accommodate the needs of software developers and others who want to use handwriting recognition input methods and Interactive Ink technology.

EECatalog: What’s on the horizon for ink management?

Baum, MyScript: One of the drawbacks for digital ink users has been ink lag. It’s very unnatural for ink to flow out of the pen but lag behind the pen point. It is not what we are used to with pen and paper, so we don’t like it on a digital device. A tremendous amount of activity has been focused on reducing the lag of the ink [with regard to] the operating systems, the digital stylus, the hardware underneath, the controllers, and so on. One of the technologies being applied, and one we are partnering with a technology leader on, is one where the ink coming off the pen is displayed on the screen for a short period of time—a few milliseconds to a few hundred milliseconds—and then that ink disappears as the application has time to receive, process, and display the digital pen input.

That requires an interaction between the hardware, the platform, and the architecture—CPUs and system-level silicon and the application itself. We have to tell the hardware facility what color and what type of line to display, so that it can draw this temporary ink and have it look natural. If it is effective, the ink appears to the user as instantaneous. Substantial work in the industry is occurring on this type of problem—we are going to be discussing it at the future.write(); conference MyScript will once again be sponsoring. Operating systems are also addressing this issue with various forms of digital ink prediction and optimizations of the underlying hardware architecture. The result is that users can now enjoy a near pen-on-paper experience.
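
The temporary-ink technique described above can be modeled abstractly. This is a hedged sketch under assumed names (`WetInkOverlay` and its methods are illustrative, not a real driver or OS API): a fast hardware path paints strokes to an overlay immediately, and each overlay copy is dropped either when the application catches up or when a timeout of a few hundred milliseconds expires.

```python
# Illustrative model of a low-latency "wet ink" overlay (hypothetical
# names, not a real driver API). The overlay shows each stroke
# instantly; the copy is removed once the app renders the stroke
# itself, or after a timeout, whichever comes first.
class WetInkOverlay:
    def __init__(self, timeout_ms=300):
        self.timeout_ms = timeout_ms
        self.pending = {}  # stroke_id -> time (ms) it was overlaid

    def show(self, stroke_id, now_ms):
        """Hardware path: display the stroke on the overlay at once."""
        self.pending[stroke_id] = now_ms

    def app_rendered(self, stroke_id):
        """App path: the application drew the stroke; drop the copy."""
        self.pending.pop(stroke_id, None)

    def expire(self, now_ms):
        """Drop overlay ink older than the timeout."""
        self.pending = {s: t for s, t in self.pending.items()
                        if now_ms - t < self.timeout_ms}


overlay = WetInkOverlay(timeout_ms=300)
overlay.show("s1", now_ms=0)    # appears instantly, no perceived lag
overlay.show("s2", now_ms=100)
overlay.app_rendered("s1")      # the application caught up with s1
overlay.expire(now_ms=450)      # s2 is 350 ms old, past the timeout
print(sorted(overlay.pending))  # -> []
```

If the handoff is tuned well, the user never notices the switch from overlay ink to application ink, which is the “appears instantaneous” effect Baum describes.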

EECatalog: What else are you anticipating for this year’s future.write(); conference on digital writing and ink interactivity?

Baum, MyScript: We will have industry leaders talking about digital ink, digital ink management and interactivity, and digital handwriting; device considerations for an improved user experience; the standards emerging for document and [ink] interchange; and stylus consistency across operating systems and manufacturers, to name a few topics.

EECatalog: How do you see the new developer platform you have launched being used to capitalize on the features of Interactive Ink?

Baum, MyScript: For example, for the developer who says, “I want to add handwriting input to my form-based application,” [the portal] is a great place for them to go and bring that technology into their application. Or maybe you want to add search capability to a note-taking application that currently only understands ink. Now that ink can be fed into a recognition engine and interpreted, you can create a database of indexed terms such that searches become very easy. Contrast that with flipping through a pen-and-paper notebook or notebooks and trying to find something. Building apps where the user can interact with and edit the digital ink will bring new value to the market that has not been possible until now.
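
The note-search idea above amounts to feeding recognizer output into an inverted index. In this sketch the recognition engine is stood in for by pre-recognized strings (the `build_index` and `search` helpers are illustrative, not part of any MyScript SDK):

```python
# Hedged sketch: once ink has been recognized as text, notes become
# searchable via a simple inverted index (term -> set of note IDs).
from collections import defaultdict


def build_index(recognized_notes):
    """recognized_notes: {note_id: text from the recognition engine}"""
    index = defaultdict(set)
    for note_id, text in recognized_notes.items():
        for term in text.lower().split():
            index[term].add(note_id)
    return index


def search(index, term):
    """Return the IDs of all notes containing the term."""
    return sorted(index.get(term.lower(), set()))


# Stand-ins for text a recognition engine would produce from ink
notes = {
    "note1": "Replace pump gasket on unit 7",
    "note2": "Gasket order placed with supplier",
}
index = build_index(notes)
print(search(index, "gasket"))  # -> ['note1', 'note2']
```

The point is that recognition is the enabling step: once ink carries its digital meaning, ordinary text-indexing techniques apply unchanged.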

EECatalog: What does MyScript want from the developer community?

Baum, MyScript: What we want from the developers is an expansion of the digital ink ecosystem. We want digital ink to become as pervasive as keyboard input. We believe that digital ink is a very natural way to interface with the machine. Interfacing with a device used to be keyboard and character based; then we had bit-mapped displays and could add a mouse. Next, we could directly touch things and did not have to use a mouse—which made a lot of sense on small portable devices.

Now the ecosystem is poised for growth with Interactive Ink, as we have gained the ability to write on these devices effectively, get the benefit of understanding what is written, and transport that digital document into our daily workflow. What we would want from developers is to expand this capability to the world—to bring this power to more applications and to grasp that the technology is now in place. Digital active pens are very common, and they are becoming cost effective. We are seeing them show up on devices targeted at students at very low cost; we are seeing them deliver very high precision, not only for artists but for enterprise users taking notes. There is no reason now not to enjoy natural input such as handwriting. Interactive Ink elevates digital ink to be easy to use, with all the power of digital computing applied.

1. “IDC Forecasts U.S. Mobile Worker Population to Surpass 105 Million by 2020,” Bryan Bassett, Research Analyst, Mobile Enterprise, IDC, June 2015, updated August 2016.
