Eye tracking/video system demo!
Thanks for all the support!! I never thought this project would gain such attention...❤️
Video over VNC runs at a low framerate and causes very visible tearing. The second video shows how I'm able to see👀
#fursuit
#openmv
#dragon
#raspberrypi
Hey there all of you lovely furs, dragons and bots, thank you for following! Didn't even realize I'd broken 3k and it's all thanks to you :3
Just wanted to remind you this #FursuitFriday that all of you matter in this current world of madness <3
Been a while...
Didn't give up.
Somehow pushed through the last couple of months.
Graduated today and can proudly wear this stupid hat x3
HUGE thanks to all of you following... and happy #FursuitFriday!
This #dragon is tired but ready to move forward with this project now.
Ayy it's Friday again, tho this pic is old as this poor guy is undergoing surgery at the moment :3
Waiting for a new, smaller camera unit to arrive to replace the old and very visible one on this derp's nose.
#fursuit
#FursuitFriday
#dragon
You just saw a pair of dragons and a dino (anchiornis) blocking the hallway🔥
Had a blast at the con yesterday, met amazing people and made new friends!!
🐲 @/DragonParticle
🦖
@TerniBird
📸 @/randomizer0
#pörröcon
#furry
#fursuit
#dragon
Fun little POV fpv camera project, first prototype. Runs on a RPi Zero 2 W with an OV5647 camera sensor, 240x240 @ ~60fps via ffmpeg. The screen (ST7789) uses the fbcp-ili9341 display driver. Latency isn't that bad but the display sucks pixel-wise :d
#rpizero
#st7789
#povcamera
#fpv
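For anyone wondering whether a little SPI display can even keep up with that, here's a rough back-of-the-envelope in Python. The resolution, color depth, and framerate are from the post; the SPI clock figure is an assumption for illustration, not a measured value:

```python
# Rough bandwidth check for pushing 240x240 RGB565 frames at 60 fps
# over SPI to an ST7789-class display. Only the resolution/framerate
# come from the post; the SPI clock is an illustrative assumption.

WIDTH, HEIGHT = 240, 240      # ST7789 panel resolution
BYTES_PER_PIXEL = 2           # RGB565 = 16 bits per pixel
FPS = 60

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
payload_bps = frame_bytes * FPS * 8   # bits/s of raw pixel data

SPI_HZ = 40_000_000           # assumed SPI clock rate

print(f"frame size : {frame_bytes} bytes")
print(f"payload    : {payload_bps / 1e6:.1f} Mbit/s")
print(f"bus usage  : {payload_bps / SPI_HZ:.0%} of a {SPI_HZ / 1e6:.0f} MHz SPI clock")
```

Taken at face value, full 60 fps doesn't fit in a 40 MHz clock, which is presumably why fbcp-ili9341 overclocks the SPI bus and only transmits changed spans of the framebuffer rather than whole frames.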
@Mascotdom
Camera (OV5647) coupled with an RPi Zero 2 W running Debian 11 without an X server. The camera input is fed directly to fb0 with ffmpeg complex filtering enabled, at 454x454 @ 35fps. The A53 chip keeps up with the video nicely at 1:1 speed, but the hardware has some latency, around 20-50 ms.
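At its core, that camera-to-fb0 path boils down to packing pixels into the 16-bit RGB565 layout the framebuffer expects and writing the raw bytes out. A minimal sketch of that step, with the post's resolution; on the Pi the output would be `/dev/fb0`, here a plain file stands in so it runs anywhere:

```python
# Sketch of the pixel-packing step behind the ffmpeg -> /dev/fb0 path:
# convert RGB888 pixels to RGB565 words and write one raw frame.
# The 454x454 resolution is from the post; the gradient test frame and
# output file are stand-ins for the real camera feed and framebuffer.
import struct

W, H = 454, 454

def rgb888_to_rgb565(r, g, b):
    """Pack 8-bit channels into one 16-bit RGB565 word."""
    return ((r & 0xF8) << 8) | ((g & 0xFC) << 3) | (b >> 3)

# Dummy frame: horizontal red gradient (a real pipeline reads the camera).
frame = bytearray()
for y in range(H):
    for x in range(W):
        frame += struct.pack("<H", rgb888_to_rgb565(x * 255 // (W - 1), 0, 0))

with open("fb0_demo.raw", "wb") as fb:   # would be /dev/fb0 on the Pi
    fb.write(frame)

print(len(frame))  # one frame = W * H * 2 bytes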
@yanadsl
And it's possible to render 2 images from one cam to drive a second display module, but duplicating the camera module was cheaper, and faking stereo from one cam is pretty resource-intensive, aka impossible for the RPi Zero :3
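The fake-stereo idea is basically: duplicate each mono frame into a side-by-side buffer, shifting one copy a few pixels for artificial parallax. A toy sketch of that (tiny integer "frame" instead of real RGB565 data, and the shift value is an arbitrary assumption) shows why it doubles the per-frame pixel work:

```python
# Toy sketch of "fake stereo" from a single camera: duplicate each row
# into a side-by-side (left|right) layout, shifting the right copy a few
# pixels for parallax. On real RGB565 frames this copy/shift is exactly
# the extra work a RPi Zero can't afford at full framerate.

def fake_stereo(frame, shift=4):
    """Return a side-by-side (left|right) frame from a single mono frame."""
    out = []
    for row in frame:
        left = row
        # Shift the right eye's view horizontally, repeating the edge pixel.
        right = row[shift:] + row[-1:] * shift
        out.append(left + right)
    return out

mono = [[x for x in range(8)] for _ in range(4)]  # tiny 8x4 "frame"
stereo = fake_stereo(mono, shift=2)
print(len(stereo[0]))  # 16: width doubles, so per-frame work doubles too
```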
@Feronium
Used an array of 3 small ~70 mA LEDs with a max radiant intensity of 40 mW/sr. I highly recommend checking this or some other eye-safety risk assessment.
Also set the camera sensor's exposure/contrast registers high to filter out unwanted shadows.
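For anyone repeating the LED setup, the first-order sanity check is irradiance at the eye: E = I / d² for a point source. A sketch using the post's 40 mW/sr figure; the distances are assumptions, the point-source approximation overestimates at very close range, and none of this replaces a proper photobiological (IEC 62471-style) assessment:

```python
# Point-source irradiance estimate for the eye-illumination LEDs:
# E = I / d^2, with I in W/sr and d in metres. The 40 mW/sr max figure
# is from the post; the distances are assumptions, and this is no
# substitute for a proper eye-safety risk assessment.

RADIANT_INTENSITY = 0.040  # W/sr per LED (max)
NUM_LEDS = 3               # worst case: all three pointed at the eye

for d_cm in (2, 5, 10):    # assumed LED-to-eye distances inside the head
    d = d_cm / 100
    irradiance = NUM_LEDS * RADIANT_INTENSITY / d ** 2  # W/m^2
    print(f"{d_cm:>3} cm: {irradiance:7.1f} W/m^2")
```

The strong inverse-square falloff is why a centimetre or two of extra LED-to-eye distance makes a big difference in any safety margin.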
@Mascotdom
Not documented yet; the paws/claws are still in progress and will have a wireless controller for eye modes, fan speed, etc.
I'll probably dump everything to some repo once it's all done.
This album... it gets better every time and makes me shed a tear again and again 🥹
I rarely post stuff outside fursuit things, but this album... hit so hard and made me realize I haven't felt happiness like this for some time, till now.
Thank you #wintersun for #time2 🤘🐲
@prawnzo_
Being a furry and keeping it a secret from true irl friends is kinda thrilling, and tbh in the end they'd be fine with it anyways😅
@FoxBoyeLocksley
Quite simple, the eye contours and lids are just PNGs converted to RGB hex arrays and loaded from flash. I used a Teensy 4.1 and a modified TeensyEyes project from GitHub, which is based on the Adafruit Uncanny Eyes.
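The PNG-to-hex-array step can be sketched like this: pack each pixel into RGB565 and emit a C array that can live in the Teensy's flash. A real tool would read the PNG with e.g. Pillow; here a tiny in-memory image stands in so the script is self-contained, and the array name is made up for illustration:

```python
# Sketch of the "PNGs converted to RGB hex arrays" step: pack (r, g, b)
# pixels into RGB565 words and print a C initializer suitable for flash
# storage on the Teensy. The 2x2 stand-in image and the array name are
# illustrative, not from the actual project.

def rgb565(r, g, b):
    """Pack 8-bit channels into one 16-bit RGB565 word."""
    return ((r & 0xF8) << 8) | ((g & 0xFC) << 3) | (b >> 3)

# Stand-in "eyelid" image: 2x2 pixels of (r, g, b) tuples.
pixels = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 255)]

words = [rgb565(*p) for p in pixels]
body = ", ".join(f"0x{w:04X}" for w in words)
print(f"const uint16_t eyelid[{len(words)}] = {{ {body} }};")
# → const uint16_t eyelid[4] = { 0xF800, 0x07E0, 0x001F, 0xFFFF };
```

Pre-baking the pixels this way means the Teensy just copies words out of flash at draw time instead of decoding PNGs on the fly.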
@JcoolShipb
Yep, this first version only has one screen, for the left eye. RPi -> MIPI -> 1x OLED. The next version will have 2 displays, but it's going to be a challenge since there isn't much space for the optics and eye tracker.
@AtsushiaiH
It's a Teensy/GC9A01 and a bit of code from GitHub + an eye tracker (via UART). I'll be doing a refresh of the head in the coming months with dual-display vision. Will probably do a writeup then.
@gamehowlers
I will probably do a writeup after the second version. Shortly: an OpenMV camera module provides the tracking data via UART to a Teensy 4.1 that runs the eyes. You can check TeensyEyes on GitHub, made by chrismiller.
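The OpenMV-to-Teensy link can be sketched as a small framed packet going over the UART. The layout here (sync byte, int16 x/y, blink flag, checksum) is an assumption for illustration, not the project's actual protocol; the same pack/parse logic would run as MicroPython on the OpenMV side and as equivalent C++ on the Teensy side:

```python
# Sketch of a gaze packet for the OpenMV -> UART -> Teensy 4.1 link.
# Packet layout (sync, int16 x, int16 y, u8 blink, u8 checksum) is an
# illustrative assumption, not the project's documented protocol.
import struct

SYNC = 0xAA

def pack_gaze(x, y, blink):
    """Build one tracker packet: sync byte, little-endian body, checksum."""
    body = struct.pack("<hhB", x, y, blink)
    checksum = sum(body) & 0xFF
    return bytes([SYNC]) + body + bytes([checksum])

def parse_gaze(packet):
    """Return (x, y, blink), or None if the frame is invalid."""
    if len(packet) != 7 or packet[0] != SYNC:
        return None
    body, checksum = packet[1:6], packet[6]
    if sum(body) & 0xFF != checksum:
        return None
    x, y, blink = struct.unpack("<hhB", body)
    return x, y, blink

pkt = pack_gaze(-120, 45, 0)
print(parse_gaze(pkt))  # (-120, 45, 0)
```

The sync byte and checksum matter on a wearable: if the UART drops a byte mid-stream, the renderer can resynchronize on the next frame instead of drawing garbage eye positions.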
@LolouTheFox
Sounds like a nice idea. I would prefer using "already existing" data if that's available via some api/web solution, but there's also the processing of it (images to data structures), so it might need separate cloud instances to run. Not sure how legal that would be tho :D