android - Using UiAutomation from an accessibility service


I am writing an accessibility service for Android that aims at providing different alternatives for people with physical disabilities to control a device, for instance, using switches, scanning, head tracking, and others.

Currently, to perform the actual actions on the application's interface I use the accessibility API, in particular the AccessibilityNodeInfo.performAction() method (a minimal sketch of this approach follows the list below). It works fine most of the time, but I found some important restrictions:

  • Most keyboards (IMEs) do not work. I only had success with the Google keyboard on Lollipop (API 22), and I had to use AccessibilityService.getWindows(). For lower API versions I had to develop a special keyboard (undoubtedly not the optimal solution).
  • Most games are not accessible. Period. They do not export an AccessibilityNodeInfo tree.
  • Web navigation is not practical (no scrolling, among other issues).
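
For context, here is a minimal sketch of the performAction() approach described above, assuming it runs inside an AccessibilityService subclass; the class and helper names (SwitchAccessService, clickFocusedNode, findImeWindow) are mine, purely for illustration, and are not part of the actual service:

  // Inside an AccessibilityService subclass (getWindows() needs API 21+).
  import android.accessibilityservice.AccessibilityService;
  import android.view.accessibility.AccessibilityEvent;
  import android.view.accessibility.AccessibilityNodeInfo;
  import android.view.accessibility.AccessibilityWindowInfo;

  public class SwitchAccessService extends AccessibilityService {

      // Click whatever node currently holds input focus.
      private boolean clickFocusedNode() {
          AccessibilityNodeInfo root = getRootInActiveWindow();
          if (root == null) return false;
          AccessibilityNodeInfo focused = root.findFocus(AccessibilityNodeInfo.FOCUS_INPUT);
          return focused != null && focused.performAction(AccessibilityNodeInfo.ACTION_CLICK);
      }

      // Locate the soft keyboard window, as mentioned for Lollipop above.
      private AccessibilityWindowInfo findImeWindow() {
          for (AccessibilityWindowInfo window : getWindows()) {
              if (window.getType() == AccessibilityWindowInfo.TYPE_INPUT_METHOD) {
                  return window;
              }
          }
          return null;
      }

      @Override
      public void onAccessibilityEvent(AccessibilityEvent event) { }

      @Override
      public void onInterrupt() { }
  }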

A solution would be to use a different API to perform the actions, and it seems that android.app.UiAutomation would fit the purpose. According to the documentation it "allows injecting of arbitrary raw input events simulating user interaction with keyboards and touch devices", which is exactly what I am looking for; although I understand that UiAutomation is intended for testing purposes (and, perhaps, not ready for production quality code) and that, perhaps, it might not behave the same on different devices. I also understand that such an API might be a security hole if any application could use it. But it seems reasonable to allow an accessibility service to use UiAutomation, given that AccessibilityNodeInfo.performAction() already provides similar "powers".
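
For comparison, this is roughly how UiAutomation is typically reached from an instrumentation test, which is the context the API is documented for; the tap() helper, its coordinates, and the androidx.test dependency are assumptions of mine, not something from the question:

  // How UiAutomation is typically obtained in a test started with "am instrument".
  import android.app.Instrumentation;
  import android.app.UiAutomation;
  import android.os.SystemClock;
  import android.view.InputDevice;
  import android.view.MotionEvent;
  import androidx.test.platform.app.InstrumentationRegistry;

  public class RawInputSketch {

      // Inject a single tap at (x, y) as raw touch events.
      static void tap(float x, float y) {
          Instrumentation instrumentation = InstrumentationRegistry.getInstrumentation();
          UiAutomation automation = instrumentation.getUiAutomation();

          long downTime = SystemClock.uptimeMillis();
          MotionEvent down = MotionEvent.obtain(downTime, downTime,
                  MotionEvent.ACTION_DOWN, x, y, 0);
          down.setSource(InputDevice.SOURCE_TOUCHSCREEN);
          automation.injectInputEvent(down, true);   // sync = wait until injected

          MotionEvent up = MotionEvent.obtain(downTime, SystemClock.uptimeMillis(),
                  MotionEvent.ACTION_UP, x, y, 0);
          up.setSource(InputDevice.SOURCE_TOUCHSCREEN);
          automation.injectInputEvent(up, true);

          down.recycle();
          up.recycle();
      }
  }

In that context the Instrumentation instance is created and connected by the system; as far as I understand, an Instrumentation constructed directly inside an app process has no such connection.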

So I tried the following inside my accessibility service:

 Instrumentation i = new Instrumentation();
 UiAutomation automation = i.getUiAutomation();

But getUiAutomation() returns null.

Is there any option to use UiAutomation (or a similar API) inside an accessibility service?

BTW: rooting the device is not an option, so I cannot inject events through the screen driver.

