• 0 Posts
  • 7 Comments
Joined 5 months ago
Cake day: February 11th, 2024

• Thanks for the question, it actually made me look for the API. Looks like I misremembered it, and there aren’t actually any exposed APIs for developers regarding attention. Internally it’s used by iOS to check when you’re looking at the screen for Face ID and to keep the screen lit while you’re reading.

    There are APIs for developers that expose the position of the user’s head, but apparently they exclude eye information. Looks like it’s also pretty resource intensive, and mainly intended for AR applications.

    The Face ID / Touch ID API essentially only returns “authenticated”, “authenticating”, and “unauthenticated”. The prompts / UI are stock iOS and cannot be altered, save for showing a reason.
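
    To illustrate, here’s a minimal sketch of that flow using Apple’s LocalAuthentication framework. The reason string (“Unlock your notes” below is just a placeholder) is the only part of the prompt an app controls; everything else is drawn by the system:

    ```swift
    import LocalAuthentication

    let context = LAContext()
    var error: NSError?

    // Check whether biometrics are available and enrolled on this device.
    if context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) {
        context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                               localizedReason: "Unlock your notes") { success, evalError in
            // All the app ever learns is pass/fail — no gaze or attention data.
            if success {
                // Proceed as authenticated.
            } else {
                // Handle failure; evalError describes why (cancelled, lockout, etc.).
            }
        }
    } else {
        // Biometrics unavailable or not enrolled; fall back to passcode entry.
    }
    ```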




  • Non-tech: I’m a psychiatrist, generally working with offenders in hospitals and prisons. The clinical work is always interesting, and I’m usually thankful for the openness with which people spill their life stories to me.

    Tech: I’ve kinda taught myself software development since I started working as a doctor. There are just too many inefficiencies in the way we work clinically day-to-day, due to the sheer amount of defensive practice inherent in the health system. Started off with personal tools to “assist” the electronic systems in place. But since then I’ve launched and maintained a number of digital clinical tools in a few local hospitals, which I’m pretty proud of.