f-log

just another web log

26 Apr 2019:
revive a vive in just four hours
Back in August 2016 I got access to an HTC Vive VR system. Had lots of fun with it, including showing it to a Princess.

Since then it has been out a couple of times, but for at least a year, maybe two, it has been boxed up, out of the way. The lighthouse system needs a lot of space and the setup and put-away process is not quick.

Then I got the opportunity to set it up for a play with my boys. Unpacked all the hardware;

Vive link box
Vive Lighthouse x2
Vive Controller x2
Tripod x2
4-way power adaptor x2
Vive sync cable (very long)
headphones
Vive charger x2 (for controllers)
Vive headset
Various cables

Turned on the computer and;
Windows 10 update
Steam update
HTC Viveport update

HTC Viveport seems to be HTC's own VR store, but as all my content is on Steam I quickly dismissed it. It was quite insistent that I sign up for a monthly or yearly plan and I can see some users thinking that is the only way to use the Vive.

Then all the Steam VR games wanted to update. Then the C: drive ran out of space. Find out you cannot move Steam games if they are pending an update.

Then Steam VR wanted an update.

Ready to play?

NO!

Steam VR wanted to update

Vive headset
Vive Lighthouse (each separately)
Vive Controller (each separately)

Now running Steam VR reported the graphics driver was out of date!

Find that the NVidia GeForce Experience that should be doing updates wants me to create an account, and the internet confirms nobody likes that. Downloaded the NVidia drivers separately.

Try to install the drivers. After running for a bit, it reports that another process is updating and to try again later!

Windows is doing MORE updates!

AGHhhhh!

A reboot later and I can update the graphics driver and finally play a game.

That took just under four hours!

We had less than an hour's play time before I had to pack it all away again.

Still, it will be quicker next time ;)
21 Apr 2019:
icmp magic to enliven mote enlightenment
Last time on my quest for mote perfection I said I wanted to link the LEDs to my Toilet roll security camera.

Well, I realised that having the Motes manually set up with Python meant I had to manually turn them off. I could leave them on 24x7, but that does not sit very well with me.

I wanted them to come on when I was at the computer and then turn off when I was not there. Additionally, I wanted that process to be fading not binary on/off.

Started off with some experiments into colour space in Python. This was much much simpler than I expected.

import colorsys

# colorsys works on 0.0-1.0 values, but the maths scales consistently,
# so 0-255 components go in and come back out on the same scale
H, S, V = colorsys.rgb_to_hsv(R, G, B)
V = V * brightness   # scale only the Value (brightness) component
R, G, B = colorsys.hsv_to_rgb(H, S, V)

Where brightness is a floating-point number between 0.0 and 1.0
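
For example, a quick sanity check with made-up values, dimming a pixel to half brightness:

import colorsys

h, s, v = colorsys.rgb_to_hsv(255, 0, 128)
r, g, b = colorsys.hsv_to_rgb(h, s, v * 0.5)
print(int(r), int(g), int(b))   # prints: 127 0 64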

So that was the fading/brightness taken care of. How to get a signal from the Toilet roll raspberry pi camera to the Mote controlling Raspberry Pi?

Ping! I really did not know how easy it would be to create an ICMP/Echo server, but I was really interested to find out. Python, as always, makes this almost a doddle.

import sys, socket

def listen():
    # raw ICMP socket; needs root, hence the sudo hint below
    try:
        s = socket.socket(socket.AF_INET, socket.SOCK_RAW, socket.IPPROTO_ICMP)
        s.setsockopt(socket.SOL_IP, socket.IP_HDRINCL, 1)
    except socket.error, err:
        print err
        print "run with sudo"
        sys.exit(2)
    s.settimeout(10)
    try:
        data, addr = s.recvfrom(1508)
        print "Packet from %r: %r" % (addr, data)
    except socket.timeout:
        print "timeout"

listen()

FYI, that code is Python 2; the final code was converted to Python 3 by fixing the print statements and changing socket.error, err: to socket.error as err:

Running via sudo then allows the program to intercept the ICMP/Ping messages. The sender still gets proper Ping responses.

Now every time the program received a Ping message it would increase the brightness, and a timeout without a Ping message would decrease the brightness.
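
A minimal sketch of that loop, reusing the socket from the listener above; set_mote_brightness() is a hypothetical helper applying the HSV scaling shown earlier, not a function from the repo:

brightness = 0.0
while True:
    try:
        s.recvfrom(1508)                         # a Ping arrived, fade up
        brightness = min(1.0, brightness + 0.1)
    except socket.timeout:                       # quiet for 10 seconds, fade down
        brightness = max(0.0, brightness - 0.1)
    set_mote_brightness(brightness)              # hypothetical: rescale V and rewrite the pixels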

To augment the Toilet roll raspberry pi camera code all I had to do was add
os.system('ping -c 1 pimote')

into the PIR state change. Remember, the code checks to see if the camera should be taking pictures; it is always monitoring for PIR changes.

Latest mote code is on github https://github.com/robgithub/undershelflighting

Noticed the Toilet roll raspberry pi camera still has my secret keys embedded, naughty me! So, NOT on GitHub.

Incidentally, the Toilet roll raspberry pi camera has been running very happily for many years, but was not running last night. Tracked it down to having crashed when I moved everything around on my desk; I must have knocked the SD card. They really stick out of the old Pis!
19 Apr 2019:
Tales of 1.3in lcd woe ftw
Got this cool little LCD panel for less than £10
Photograph of a Waveshare 1.3inch LCD Module connected to a Raspberry Pi showing a custom image on the LCD
It is 240x240 (HD) 1.3in: RGB, 65K colours
Yes, it really says HD next to the dimensions on the product page

It seems I am the only one that ever has issues getting things to work, so get ready for another rambling tale of woe, but I get it working in the end :)

In the box are;
LCD mounted on PCB
Cable with plug to PCB and individual wires to the Pi
Welcome note

Welcome note says to go to
https://www.waveshare.com/wiki/
to download the drivers/manual.

Problem no. 1.
The Wiki page has hundreds and hundreds of products.
Search in the Raspberry Pi section, nope.
Search the Display section, nope.
Eventually find 1.3inch LCD Module in the OLED section, even though it is not an OLED screen.

Great, now download the User Manual
Quick flick down the 14-page PDF and ... Ah! A neat and clear diagram with all the wires connected to the Raspberry Pi
And, it's colour coded, bonus!
Plug red one here, black there, green, yellow etc ...
Hmmm, there is a purple cable in my wires and no purple on the diagram ... But there is a maroon cable. As all the other cable colours have been accounted for, that must be my purple.
...
do all the software - more on that later
...
nothing works :(
Find that all the coloured cables in my hardware are connected to different output pins on the LCD circuit board!
Right, so the red cable maps to the maroon, the black to white etc ...
Eight cables later and it works!

Now, all the non-wiring things I had to do to get it to run the demos.

Always update your Pi before starting a new project.
sudo apt-get update
sudo apt-get upgrade


Get the source code package.
wget https://www.waveshare.com/w/upload/5/5d/1.3inch_LCD_Module_code.7z


and unpack it. Rather oddly, the documentation instructs you to download to a PC/MAC, unpack it, copy it to the boot partition on the SD card and then boot the Pi and copy it off again.
p7zip -d 1.3inch_LCD_Module_code.7z
Please note that you need to
sudo apt-get install p7zip-full
as
sudo apt-get install p7zip
managed to unzip most of the files but errored on one.

Now it wants us to install the chip-set drivers...
No link in the documentation...
No link on the wiki page
...
Google ...
Found Libraries Installation for RPi on Waveshare's wiki, hidden, of course.
wget https://www.waveshare.com/w/upload/d/d8/Bcm2835-1.45.tar.gz
tar zxvf Bcm2835-1.45.tar.gz
cd bcm2835-1.45/
sudo ./configure
sudo make
sudo make check
sudo make install

Not sure that we need all those sudos, but when "make" is run without "sudo" it errors. It is just "make" in the documentation :(

Also on that page are instructions to enable the SPI interface. You can ignore everything else on that page; it is generic to lots of their hardware.
sudo raspi-config
enable SPI and then reboot

We have the demo code and chipset libraries and have enabled the SPI interface; now we need the language libraries.

Wiring Pi; this just worked
sudo apt-get install git
sudo git clone git://git.drogon.net/wiringPi
cd wiringPi/
sudo ./build


Python sort of worked, this was in the documentation
sudo apt-get install python-pip
sudo pip install RPi.GPIO
sudo pip install spidev
sudo apt-get install python-imaging

but also needed
sudo pip install numpy
sudo apt-get install ttf-wqy-zenhei


Finally we can build and run the demos.
cd code/
cd RaspberryPi/
cd bcm2835/
make
sudo ./main


This shows a few bits of text, a mini logo and then a full screen image of SpongeBob SquarePants.

cd ..
cd wiringPi/
rm main
make
sudo ./main


Exactly the same as the last demo.

cd ../python/
sudo python main.py

Shows some text and then a large logo.

Now what?
Change the pic.jpg in the python folder to a 240x240 JPG of my own design, created in GIMP.
Rob on Earth 240x240 logo

Overall it was a bit of a pain to get working, but for the price I think it is a really nice, fun package.
In my non-scientific Python-based tests I got 4 fps. I did try to use the non-Python code to do the same but crashed the display, which the manual warns you about if you try to mix the Python, bcm2835 and wiringPi demos. A reboot fixed it.
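
For the curious, the crude timing went something like this (measure_fps and show_frame are illustrative names, not part of the Waveshare code):

import time

def measure_fps(show_frame, seconds=5):
    # show_frame: whatever call pushes one full frame to the panel
    frames = 0
    start = time.time()
    while time.time() - start < seconds:
        show_frame()
        frames += 1
    return frames / (time.time() - start)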
09 Apr 2019:
uv project from view broken down just for youtubers
Apologies to anyone who has already seen this on YouTube and came here to find out more, I got distracted ;)



I had been thinking about this idea for a couple of years and just never got around to trying. To be fair, I was not 100% sure it would work.

The basics are simple: take a Mesh in Blender and UV unwrap it using the Project from View method. Once this is done, wiring up an image texture will appear as if the view/camera is projecting the image directly onto the Mesh.

I think there may be some more stuff I can do with this, but for this experiment I wanted to run a physics simulation, wait until all the Meshes were settled and then UV unwrap using Project from View.

Playing back the simulation would appear confusing from a texture point of view until, once again, the Meshes settled.

Obviously the Camera had to be in the same position it had been when the UV unwrap using Project from View had been used, but I got tripped up on one other thing.

ENTROPY!

Blender's physics has random variations that are often very subtle. I would UV unwrap using Project from View on a completed simulation, only to run it again and find Meshes in slightly different/wrong places, breaking the illusion. The fix was to Bake the physics so nothing changed the route each Mesh would take.

I did wonder if this needed a full tutorial with voice over, like my light titles video. In the end I opted just for this demonstration as it is unlikely many people will want to replicate it.

I got to use my The Persistence of Pi image as the texture. This worked really well due to the mix of colours and textures.

I rendered the bowl separately as a single image, with and without the UV, and then used the VSE to combine the results without having to render everything twice (it took a while at 1920x1080). This caused an issue with the ball animation overlapping the bowls. So, I rendered a separate image for the front side of the bowl, with and without the UV. With this set as the top most layer, you believe the result is a fully rendered simulation in a bowl.

To emphasise the experience I created an animation and used that Image Sequence as the UV Texture. This time the Camera changes position, trying to show the Image Sequence is not a Lamp projecting the Texture. That is why the plane flies across the simulation. No, I do not think it worked out particularly well :(

Did have a lot of fun making the animation though. Blender Text as a closed and Extruded Mesh with an Emitter set to Grid and Volume. It needed a few tweaks, but then the Glare/Ghosts Node in the Compositor made it come to life.

Now, UV unwrap using Project from View on each Mesh might have been feasible by hand with the 32 Meshes, but the dropping blocks animation has 200 Meshes. We need a touch of Automation.

Blender UV unwrap Project from View selected objects script
(Gist created with https://github.com/ceremcem/create-gist/blob/master/create-gist.sh)

and as with my previous Blender scripting, very little is written by me. It basically loops through all the selected objects and for each Mesh runs a method. This "unwrap_from_view(TARGET)" sets the Mode and the Selection (for UV unwrapping) then tries to find;

All the Windows and associated Screens
All the Areas that are set as 3D view
All the Regions that are type Window

and use that information to create a Context that can be passed to bpy.ops.uv.project_from_view as an override to the default context.
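
The core of it looks something like this; a simplified sketch against the 2.7x-era bpy API, not the gist verbatim:

import bpy

def unwrap_from_view(target):
    # Make the target active and select all its faces for unwrapping
    bpy.context.scene.objects.active = target
    bpy.ops.object.mode_set(mode='EDIT')
    bpy.ops.mesh.select_all(action='SELECT')
    # Walk every Window/Screen/Area/Region combination that is a 3D view
    for window in bpy.context.window_manager.windows:
        for area in window.screen.areas:
            if area.type != 'VIEW_3D':
                continue
            for region in area.regions:
                if region.type != 'WINDOW':
                    continue
                # The Context override makes the operator think it was
                # invoked from this 3D view
                override = {'window': window, 'screen': window.screen,
                            'area': area, 'region': region,
                            'edit_object': target}
                bpy.ops.uv.project_from_view(override, correct_aspect=True,
                                             scale_to_bounds=False)
    bpy.ops.object.mode_set(mode='OBJECT')

for obj in bpy.context.selected_objects:
    if obj.type == 'MESH':
        unwrap_from_view(obj)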

Contexts are a big deal in Blender scripting and I would have never been able to decipher all that by myself. Luckily the internet provides :D

The bpy.ops.uv.project_from_view wants the user to be clicking the Unwrap selection from a 3D window, that's how it knows which view to set the UV projection from. For my script I fake all that and force it to unwrap for EVERY 3D view, so it is best to only have one :)

It is a fun effect and even more so for being automated.
08 Apr 2019:
completed undershelf lighting project
Completed what I needed for my under shelf lighting with the Pimoroni Mote set.

code https://github.com/robgithub/undershelflighting

A few notes;

The Mote sticks, with the USB cables plugged in, are flush to the surface. I spent ages trying to overhang the plug, thinking it was thicker than the Mote stick.

I created all the code (see above) to allow me to set-and-forget the state of the Mote sticks, but on reflection I may just opt for the web based server in the examples.

The Mote state is stored in a Python pickle (read: a serialised binary file).
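
The idea, sketched with illustrative names (the real code is in the repo above):

import pickle

STATE_FILE = 'mote_state.pickle'

def save_state(state):
    # state: dict of stick number -> (r, g, b)
    with open(STATE_FILE, 'wb') as f:
        pickle.dump(state, f)

def load_state():
    try:
        with open(STATE_FILE, 'rb') as f:
            return pickle.load(f)
    except (IOError, EOFError):
        return {stick: (0, 0, 0) for stick in range(1, 5)}   # default: all off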

Brightness is not something that can trivially be set for non-white LEDs. It would have been nice to add to my code, but completely over the top vs time.

I want to link the Mote sticks to my trusty toilet roll camera
photo of raspberry pi with camera in toilet roll

not sure when that is going to happen.

Just chill'n with my lights now :)
08 Apr 2019:
reference
My reference for creating a GitHub fork of an open source project, creating a branch and submitting a Pull request.
This example is to create the branch fixgetpixeluse for the Pimoroni Mote

Go to the source repo and create a fork on GitHub

Clone the fork locally

git clone https://github.com/robgithub/mote.git


Enter the project folder

cd mote/


Checkout the master branch

git checkout master


Create a new branch to work on

git branch fixgetpixeluse
git checkout fixgetpixeluse


Do work

Check work into branch

git add *; git commit -m "Added brightness parameter as fix"


Push change to my fork

git push --set-upstream origin fixgetpixeluse


Go to GitHub and the source repo and create a Pull request.

...

Profit! (it is a relative term when it comes to open source, if everyone gets the benefit)
05 Apr 2019:
the simple things in life are not in the kernel sources
I ended the last post with the words "nothing is ever simple" ... or is it?

First up; it turns out no one can clone the kernel sources on a Raspberry Pi: it runs out of RAM. There are options to fake extra RAM, but in the end I downloaded the .ZIP from GitHub.

The download took about 20 seconds, but unzipping then took about five minutes. Well there are 62,223 files in there :D

Found the reference to "Indeed it is in host mode", but the number 00041d01 is just the port descriptor.

Then I went back to trying to disable OTG ...


linux-rpi-4.19.y/drivers/usb/host/dwc_otg/dwc_otg_driver.c

has a kernel parameter dwc_otg.cap which can be set to None, so let's try that.

sudo vi /boot/cmdline.txt


dwc_otg.cap=2 dwc_otg.lpm_enable=0 console=serial0,115200 console=tty1 root=PARTUUID=3ae9fe6d-02 rootfstype=ext4 elevator=deadline fsck.repair=yes rootwait

reboot and .....


Indeed it is in host mode hprt0 = 00021d01


which should mean it works, but no devices in lsusb and no errors :(

Wait a minute. The Mote LED sticks are attached by USB micro cables. I wonder ...

That worked, so in the end all I need to do is order an extra cable :(
https://shop.pimoroni.com/products/mote-module-cable

Feel stupid much?

Now I have to write my state-based LED control tools and mount the sticks under my shelves.
04 Apr 2019:
into the pi mote badger set never to return
I got a Pimoroni Mote kit and now I am downloading the Raspberry Pi Kernel sources.

What? Why?

Well, are you sitting comfortably, because this is going to take a while...

I recovered a Pi Zero W with a 16GB SD card from my unused/unloved super computer.

Flashed the latest Raspbian lite onto it.

time dd bs=4M if=/home/user/2018-11-13-raspbian-stretch-lite.img of=/dev/sdd status=progress conv=fsync


1866465280 bytes (1.9 GB, 1.7 GiB) copied, 131 s, 14.2 MB/s
445+0 records in
445+0 records out
1866465280 bytes (1.9 GB, 1.7 GiB) copied, 190.474 s, 9.8 MB/s

real    3m10.651s
user    0m0.010s
sys    0m2.744s

(destination drive /dev/sdd not partition)

Mounted the SD card and set up wifi

cp wpa_supplicant.conf /mnt/sdd2/etc/wpa_supplicant/wpa_supplicant.conf

(destination second partition)

Set up the hostname

echo "pimote" > /mnt/sdd2/etc/hostname

(destination second partition)

Enabled ssh

touch /mnt/sdd2/.ssh
touch /mnt/sdd2/ssh
touch /mnt/sdd1/ssh

(destination first partition)

.ssh is the local user's ssh folder and /mnt/sdd2/ssh is not read on boot up. So, after a few false starts, that last one enabled ssh.


ping pimote
ssh pi@pimote

It lives!

A quick peruse of Pimoroni's Getting started guide and it was on with the show.

If you trust Pimoroni then the next step is to pipe a script to Bash

curl https://get.pimoroni.com/mote | bash

Instead, I downloaded the mote file and had a look through. 1,120 lines later I was convinced it was just boilerplate installer stuff.

It ran for a while downloading and installing stuff and I opted for the extra examples and documentation.

There are a few different Python scripts in the examples folder, including one called test.py which seemed the best to try first... and it errored :(

The other scripts ran without issue and showed a bright, colourful light show. Very bright! And that was without the optional external power supply!

At this point I wanted to know why test.py failed while all the other scripts worked.

and down the rabbit hole I went (Part 1)


Traceback (most recent call last):
  File "test.py", line 41, in <module>
    r, g, b = [int(c * 0.99) for c in mote.get_pixel(channel + 1, pixel)]
ValueError: too many values to unpack


It took a while, but by trial and error, commenting out lines and swapping things about, I found that the line mentioned expected three values to be returned (r, g, b) but was getting four.

This was duplicated in the module itself, where it had the same code as an internal test.
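
In short, the bug boiled down to this:

# get_pixel() actually returns four values: (r, g, b, brightness)
r, g, b = mote.get_pixel(1, 0)              # ValueError: too many values to unpack
r, g, b, brightness = mote.get_pixel(1, 0)  # unpacks fine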

I then cloned the Pimoroni repo

set up git

made the change and tested it

sent a pull request
https://github.com/pimoroni/mote/pull/28

and in theory the Pimoroni staff can accept my fix and no one else will be affected in the future.

Hooray for Open Source!!

But why are you downloading the Kernel sources if it was all fixed?

So, at this point I had code that changed the Mote colours on all four sticks and when the program ended the Mote sticks stayed lit.

I coded a very quick Python script that accepted a --stick parameter along with --red --green and --blue.

Tested stick one: worked perfectly. Tested stick two: not so good, same with three and four :(

After quite a bit of debugging and trying different ways to do the same thing I read the docs...
http://docs.pimoroni.com/mote/_modules/mote.html#
which refused to scroll in Firefox, so I then had to report that (also tried debugging; seems to be an out-of-date jQuery library)

Which led me to reading the Mote module source and learning that the Mote supports 1-4 Mote sticks, but does not address them separately.

When sending colour information to the sticks, that data is a serial blob/buffer that includes one stick's worth if you are only using the FIRST stick, two for the first two, and so on. So to change stick four to Red you have to include buffer data for sticks 1, 2 and 3 as well!
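
In code that means something like this (standard Mote calls, illustrative colours):

from mote import Mote

mote = Mote()
for channel in range(1, 5):
    mote.configure_channel(channel, 16, False)   # 16 pixels per stick

# Only stick four is changing, but sticks 1-3 must be rewritten too
previous = {1: (0, 0, 64), 2: (0, 0, 64), 3: (0, 0, 64)}   # e.g. from a saved state
for channel, (r, g, b) in previous.items():
    for pixel in range(16):
        mote.set_pixel(channel, pixel, r, g, b)
for pixel in range(16):
    mote.set_pixel(4, pixel, 255, 0, 0)          # stick four to Red
mote.show()   # one buffer covering all four sticks goes out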

Drat! I wanted to set them and exit independently.

I resigned myself to having to write a status store so that updates would include the already set stick buffers, and moved on to where to mount the lights in my shelves.

Rather than a rabbit hole, this next one is a badger's set (part 2).

The Raspberry Pi Zero has a USB Micro B socket and the Mote cable is USB Type A. I had added an adaptor and the Mote worked. Now I want to be rid of the ugly, fat adaptor (that was pushing my power cable).

There is no such thing as a USB Micro B male to Micro B male cable anywhere to be found other than an OTG cable. How different can it be?
Google USB OTG wiring

and the cable I got delivered did not work :(

But wait! The Pi sees the cable connection. Maybe I can convince it to disable the OTG and just work with USB?

No, and I can find no one in the world who has ever tried this.


Indeed it is in host mode hprt0 = 00041d01


So now I am trying to find out what 00041d01 means, but cloning the kernel sources is not working out so well.

pi@pimote:~/src $ git clone https://github.com/raspberrypi/linux.git


Cloning into 'linux'...
remote: Enumerating objects: 6950699, done.
Receiving objects: ... (.../6950699), 1.90 GiB | 16.00 KiB/s
error: index-pack died of signal 9
fatal: index-pack failed


nothing is ever simple.