NLnet grant announcement: https://github.com/Helium314/HeliBoard/issues/2226
NLnet project description: https://nlnet.nl/project/GestureTyping/

Swipe-o-Scope repository: https://codeberg.org/eclexic/swipe-o-scope

CC BY-SA 4.0 license: https://creativecommons.org/licenses/by-sa/4.0/

Article on data anonymization: https://www.science.org/doi/10.1126/sciadv.adn7053

Contact me:
My Mastodon: https://mstdn.social/@theeclecticdyslexic
My Matrix: https://matrix.to/#/@eclexic:matrix.org

Text based tutorial: https://github.com/Helium314/HeliBoard/wiki/Tutorial:-How-to-Contribute-Gesture-Data

It is recommended that you install HeliBoard through the F-Droid app unless you know what you are doing!
How to install F-Droid: https://f-droid.org/en/docs/Get_F-Droid/
HeliBoard on F-Droid: https://f-droid.org/en/packages/helium314.keyboard/
HeliBoard on GitHub: https://github.com/Helium314/HeliBoard/releases

Gesture typing library links from MindTheGApps:
ARM64: https://gitlab.com/MindTheGapps/vendor_gapps/-/blob/fe250848941171fe339ca9a44bc9a42aefb0be7d/arm64/proprietary/product/lib64/libjni_latinimegoogle.so
ARM: https://gitlab.com/MindTheGapps/vendor_gapps/-/blob/fe250848941171fe339ca9a44bc9a42aefb0be7d/arm/proprietary/product/lib/libjni_latinimegoogle.so
X86_64: https://gitlab.com/MindTheGapps/vendor_gapps/-/blob/fe250848941171fe339ca9a44bc9a42aefb0be7d/x86_64/proprietary/product/lib64/libjni_latinimegoogle.so


Other ways to contribute:
Providing packaging scripts for Swipe-o-Scope:

  • for Windows
  • for macOS
  • for Flatpak. This is a big task: Swipe-o-Scope uses Qt modules (the QtGraphs module) that the KDE SDK does not currently support, and it is written with the PySide6 Python library rather than in C++. To build Swipe-o-Scope for Flatpak, you will probably have to talk with KDE developers and the PySide6 BaseApp maintainer to get the SDK and the BaseApp updated to support PyQtGraph, all on top of knowing a little about building Flatpaks.
  • for Linux via means other than Flatpak (e.g. the AUR)

Providing input or code if you are knowledgeable about any of the following:

  • gesture typing using hand-designed algorithms… (bonus points if you have worked on a paper or product that you could help us make an open implementation of WITHOUT violating anyone’s intellectual property)
  • gesture typing using neural nets and constrained compute, such as on mobile devices without TPUs… (unfortunately you may not be able to contribute here effectively until we have the data collected, organised, and released at the end of the collection period)
  • the JNI in Android… (bonus points if you have a working knowledge of the AOSP Latin IME JNI library)
  • natural language processing for next word prediction, specifically comparing the suitability of a set of candidate words against one another… (either by ngrams or any other low-compute method available to a mobile device with no internet connection)
  • building a diverse small-to-medium sized multi-lingual corpus of natural language text we could legally use to simulate context… (not stealing copyrighted content in bulk, like certain companies.)
  • making desktop apps more user-friendly… (Swipe-o-Scope doesn’t currently give user feedback that would be helpful to anyone who doesn’t feel comfortable in a terminal)
  • being patient while performing thorough code review and audits of Rust code… (the gesture recognition library will most likely be written in Rust, despite my being more experienced in other languages, as I am betting on it still being popular in a decade or three)
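To illustrate the n-gram idea from the next-word-prediction bullet above, here is a minimal sketch of bigram-based re-ranking of gesture-decoder candidates. Everything here is hypothetical (the function names, the stupid-backoff-style scoring, and the toy corpus are all mine, not part of the project):

```python
from collections import Counter

def train_bigrams(corpus):
    """Count unigram and bigram frequencies from a list of sentences."""
    unigrams, bigrams = Counter(), Counter()
    for sentence in corpus:
        words = sentence.lower().split()
        unigrams.update(words)
        bigrams.update(zip(words, words[1:]))
    return unigrams, bigrams

def score(prev_word, candidate, unigrams, bigrams, alpha=0.4):
    """Stupid-backoff-style score: relative bigram frequency if the
    pair was seen, otherwise a discounted unigram frequency."""
    pair = (prev_word, candidate)
    if bigrams[pair] > 0 and unigrams[prev_word] > 0:
        return bigrams[pair] / unigrams[prev_word]
    total = sum(unigrams.values())
    return alpha * unigrams[candidate] / total

def rank(prev_word, candidates, unigrams, bigrams):
    """Order a gesture decoder's candidate words by contextual plausibility."""
    return sorted(candidates,
                  key=lambda w: score(prev_word, w, unigrams, bigrams),
                  reverse=True)

corpus = ["i want to type fast", "i want to go home", "type to me"]
uni, bi = train_bigrams(corpus)
print(rank("to", ["type", "go", "tie"], uni, bi))  # → ['type', 'go', 'tie']
```

The point of the sketch is only the shape of the problem: the gesture decoder proposes a shortlist of shape-plausible words, and a cheap offline language model breaks ties using the preceding context. A real implementation would need smoothing, pruning, and a compact on-disk representation to fit mobile constraints.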

Full disclosure: out of concern for copyright issues and code quality, we will not be accepting ANY LLM-generated contributions to this project, neither code nor corpus text. Thank you!

If you work on another on-screen keyboard, have the ability to collect data from it, and want to add to the data set, contact me about the particulars of the file format. Not all of its requirements are obvious, so please don’t assume them!

We are still in the early stages, and this project is likely to continue for quite a while, possibly well past the end of my NLnet funding. Don’t hesitate to reach out if you think you can help in some other way! We can use all the help we can get; gesture typing is a hard problem with a very high ceiling, and every little improvement matters!

Remember, even sharing this project around is helpful at the moment. I can’t boost this in all the places I probably should; I have to be working on code, and this video took long enough!

I thought it might get some reach when cross-posted here

  • AndrewZabar@lemmy.world · 9 days ago

    I like Heliboard except the autocorrect sucks balls. When I tried gboard - despite my reluctance - it was a pleasure. I really like Heliboard a lot but it’s infuriatingly bad when it comes to sensing what I meant to type.

            • INeedMana@piefed.zip (OP) · 8 days ago

              it’s infuriatingly bad when it comes to sensing what I meant to type

              I understand the context here was about the keyboard figuring out which word they meant. So that would be swipe, no?

                • AndrewZabar@lemmy.world · 8 days ago

                  I was referring to when my typing is not perfect, how it handles deciding what I meant to type. So like if I wanted to type this sentence, my fingers might have hit “si likr uf I wsntrd ti tyow tuos semtebcd, my fingerd night have git”. Whereas on my iPhone or Android with Gboard it will have corrected all of it perfectly, the HeliBoard app gets maybe 20% of it properly corrected, if I’m lucky.

                  If I could have the privacy and customizable aspects of Heliboard with the accuracy of gboard, I’d be happy happy.

    • limerod@reddthat.com (mod) · 8 days ago

      Have you tried CleverKeys? Its autocorrect is good. There are some rough edges, but the accuracy of its recommendations made me switch for good.

    • JigglySackles@lemmy.world · 7 days ago

      Its autocorrect works great for me. I never used Gboard because Google saw enough as it was. But it’s a damn sight better than Microsoft SwiftKey, which I used for a while. I was constantly fighting the autocorrect on that.

      With HeliBoard I had some initial issues, but it turned out that it was system-level autocorrect fucking with me. Once I disabled that and just used HeliBoard’s, it was (still is) excellent.

    • INeedMana@piefed.zip (OP) · 9 days ago

      It’s the same as regular gesture typing, you just type random words instead of messages. No need to speed-run ;)

      • southsamurai@sh.itjust.works · 9 days ago

        True, but for whatever reason, I didn’t think of doing it the way I usually type on Lemmy or in messages, taking frequent breaks to rest. I also didn’t think of just slowing down, but I figured they wanted actual day-to-day input style rather than idealized swiping, so I likely would have discarded the idea had it occurred to me.

  • Canuck@sh.itjust.works · 9 days ago

    KDE is working on a unified on-screen keyboard across all their desktop environments. GNOME has aspirations for something similar for GTK. Integrating into these efforts?

    • INeedMana@piefed.zip (OP) · 9 days ago

      No idea. But apart from Linux phones, I don’t think KDE or GNOME see a lot of gesture typing.

  • LiveLM@lemmy.zip · 9 days ago

    Fucking awesome, I’ll be sure to contribute as many words as I can!

    Also, I remember FUTO Keyboard also had a little web browser based game to collect swipe data at some point. Wonder if they’d be willing to share? 👀

  • undrwater@lemmy.world · 9 days ago

    Is FlorisBoard’s (also on F-Droid) gesture typing open? It’s not great, but it works… somewhat.