Consider the mouse. On paper, the mouse as an input peripheral is a horrible idea. You have to take one hand off your keyboard, which significantly lowers your typing speed. You can't watch your mouse hand while looking at the cursor it's controlling, which makes fine control extremely difficult. The space it operates in is generally smaller than the space the cursor functions in, which means there's a significant hit in mapping resolution. It would seem a much better idea to design your user interfaces to use only a keyboard, perhaps with extra keys for the functionality you would have wanted the mouse to perform.
Thing is, that's all true, and it's also all irrelevant in ways no one could foresee until people actually started designing around a mouse. It opened the door to the graphical user interface, which is a quantum leap beyond text-based console interfaces. And yet even after the mouse revolution should have taught people better, the exact same thing happened with touch interfaces. They have all the problems their detractors claimed they had; it just didn't matter in the new paradigm they launched.
Sometimes the reason we can't think of a good way to do something is that there's genuinely no good way to do it. But I believe one day someone is going to crack that nut for smartphones, or at least tablets and phablets, and the games we traditionally call triple-A-class MMORPGs. And when they do, it'll seem ridiculously obvious. It just doesn't seem that way now. The wheel as a transportation machine seems like an obvious invention too, and yet humans used wheels for pottery production for hundreds of years before someone got the idea to use them for transportation. I can almost imagine a pottery-wheel maker rolling one to its delivery site, as pottery-wheel makers had done for dozens of generations, and then suddenly: eureka.
The downside of taking one hand off your keyboard/screen, and the loss of input ability that comes with it, was solved by Gunpei Yokoi when he designed the first button-based handheld games. The problem with keyboard and mouse is that they are separate; if you merge them into one unit, side by side or on top of one another, their proximity solves the problem. Furthermore, the problem with tablets is that they operate on a touch screen. To properly use the touch screen, one hand must hold the device while the other works the screen, or you must put the device down (as I've seen many, many times) and then use both hands to work the screen. In doing so, however, you take a handheld device and make it leave your hands, where it is no more mobile or handheld than a laptop.
Nintendo fixed this with the NDS by putting buttons on the sides of the touch screen while leaving the screen within thumb range. Hands did not have to come off the device, nor did it have to be put down to use the touch screen. At least, that was the goal; the stylus negated the effort, forcing people to hold the device with one hand and remove the other to either touch the screen or use the stylus.
The real problem with touchscreen devices is that they are designed for people with three arms and hands: either one hand to hold the device and two to work the touch screen, or two hands to hold it and use the buttons/touchscreen and one to use the stylus.
Google Glass attempted to solve this problem by using the eyes as the input method on a tablet- or phone-like operating system. Problems arise with the eye-tracking software: unless the Glass is bolted onto your head, you constantly have to redo the calibration. Beyond that, the system only functions if your pupils are focused on the target; if you relax your eyes, the pupil dilates too much and the Glass can't figure out where you are looking. These are just a few problems with the Glass. The main problem is that though we have two eyes, they work together, so in reality they are one input device: they can only focus on one thing at a time.
The question here isn't how to perfect touchscreens. The best use of a touch screen was the NDS's: as a /secondary/ input device to /buttons/, much like a mouse is used with a keyboard. The goal should become how to handle most actions with fewer buttons and use the touch screen for the things buttons can't handle.
Or why use touchscreens at all? Light-gun technology is far quicker, more responsive, and more accurate, and it works with any screen, so you don't need expensive touchscreens; you can use older, cheaper, more durable screen types. Working with any screen also removes the negative aspect of "touchscreen slips", where your sleeve or wrist accidentally presses the screen, etc.
Light-gun technology, like the Wiimote's pointer, is the future.
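To make the "works with any screen" point concrete: a Wiimote-style pointer never senses the screen at all. Its IR camera watches two fixed LEDs (the sensor bar) and derives the cursor from where their midpoint falls in the camera's field of view. Here's a minimal sketch of that mapping; the 1024x768 camera resolution matches the real Wiimote, but the function, the LED coordinates, and the inversion logic are simplified assumptions for illustration, not the actual firmware.

```python
# Sketch of a Wiimote-style pointer: map two IR dot positions, as seen
# by the remote's camera, to a cursor position on an arbitrary display.
# (Hypothetical simplification; real tracking also handles tilt and depth.)

CAM_W, CAM_H = 1024, 768  # Wiimote IR camera resolution


def pointer_to_screen(led_a, led_b, screen_w, screen_h):
    """Convert sensor-bar LED positions (camera pixels) to screen pixels.

    led_a, led_b: (x, y) dots the camera sees. Their midpoint moves
    opposite to the remote's aim, so both axes are inverted before
    scaling to the target display's resolution.
    """
    mid_x = (led_a[0] + led_b[0]) / 2
    mid_y = (led_a[1] + led_b[1]) / 2
    nx = 1.0 - mid_x / CAM_W  # normalize to 0..1 and mirror
    ny = 1.0 - mid_y / CAM_H
    return (round(nx * screen_w), round(ny * screen_h))


# Remote aimed dead centre: the LEDs straddle the middle of the frame.
print(pointer_to_screen((472, 384), (552, 384), 1920, 1080))  # → (960, 540)
```

Because the display never participates, the same pointer works on a CRT, a projector, or a cheap LCD, which is exactly the advantage over touchscreens claimed above.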