If you don’t want to be tied to a video screen during home workouts, Llum Acosta, Samreen Islam, and Alfred Gonzalez shared this great Raspberry Pi–powered alternative on hackster.io: their voice-activated project announces each move of your workout routine and how long you need to do it for.
This LED-lit, compact solution means you don’t need to squeeze yourself in front of a TV or crane to see what your video instructor is doing next. Instead you can be out in the garden or at a local park and complete your own, personalised workout on your own terms.
Join us for Digital Making at Home: this week, young people can do stop motion and time-lapse animation with us! Through Digital Making at Home, we invite kids all over the world to code along with us and our new videos every week.
So get your Raspberry Pi and Camera Module ready! We’re using them to capture life with code this week:
When taking photos, most of us simply like to press the shutter button on our cameras and phones so that a viewable image is produced almost instantaneously, usually encoded in the well-known JPEG format. However, there are some applications where a little more control over the production of that JPEG is desirable. For instance, you may want more or less de-noising, or you may feel that the colours are not being rendered quite right.
This is where raw (sometimes RAW) files come in. A raw image in this context is a direct capture of the pixels output from the image sensor, with no additional processing. Normally this is in a relatively standard format known as a Bayer image, named after Bryce Bayer who pioneered the technique back in 1974 while working for Kodak. The idea is not to let the on-board hardware ISP (Image Signal Processor) turn the raw Bayer image into a viewable picture, but instead to do it offline with an additional piece of software, often referred to as a raw converter.
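The raw converter's central job is to reconstruct three colour channels from the single-channel Bayer mosaic. As a purely illustrative sketch (real converters use far more sophisticated interpolation), here is the simplest possible "demosaic", which averages each 2×2 RGGB cell into one RGB pixel:

```python
# Illustrative only: a minimal "demosaic" of a Bayer (RGGB) mosaic by
# collapsing each 2x2 cell into one RGB pixel. Real raw converters
# interpolate at full resolution, but the principle is the same:
# recover three colour channels from a single-channel mosaic.

def demosaic_rggb(mosaic):
    """mosaic: 2D list of sensor values laid out R G / G B per 2x2 cell.
    Returns a half-resolution 2D list of (r, g, b) pixels."""
    out = []
    for y in range(0, len(mosaic) - 1, 2):
        row = []
        for x in range(0, len(mosaic[y]) - 1, 2):
            r = mosaic[y][x]
            g = (mosaic[y][x + 1] + mosaic[y + 1][x]) / 2  # two green sites
            b = mosaic[y + 1][x + 1]
            row.append((r, g, b))
        out.append(row)
    return out

# A single 2x2 sensor patch: one red, two green, one blue photosite.
patch = [[200, 120],
         [100, 60]]
print(demosaic_rggb(patch))  # [[(200, 110.0, 60)]]
```

Note the two green photosites per cell: the Bayer pattern devotes half its sites to green, which is one reason unbalanced raw images look green.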
The raw image is sometimes likened to the old photographic negative, and whilst many camera vendors use their own proprietary formats, the most portable form of raw file is the Digital Negative (or DNG) format, defined by Adobe in 2004. The question at hand is how to obtain DNG files from Raspberry Pi, in such a way that we can process them using our favourite raw converters.
Obtaining a raw image from Raspberry Pi
Many readers will be familiar with the raspistill application, which captures JPEG images from the attached camera. raspistill includes the -r option, which appends all the raw image data to the end of the JPEG file. JPEG viewers will still display the file as normal but ignore the (many megabytes of) raw data tacked on the end. Such a “JPEG+RAW” file can be captured using the terminal command:
raspistill -r -o image.jpg
Unfortunately this JPEG+RAW format is merely what comes out of the camera stack and is not supported by any raw converters. So to make use of it we will have to convert it into a DNG file.
PyDNG, a Python utility, converts the Raspberry Pi’s native JPEG+RAW files into DNGs. It can be installed from github.com/schoolpost/PyDNG, where more complete instructions are available. In brief, we need to perform the following steps:
git clone https://github.com/schoolpost/PyDNG
pip3 install src/. # note that PyDNG requires Python3
PyDNG can be used as part of larger Python scripts, or it can be run stand-alone. Continuing the raspistill example from before, we can enter in a terminal window:
python3 examples/utility.py image.jpg
The resulting DNG file can be processed by a variety of raw converters. Some are free (such as RawTherapee or dcraw, though the latter is no longer officially developed or supported), and there are many well-known proprietary options (Adobe Camera Raw or Lightroom, for instance). Perhaps users will post in the comments any that they feel have given them good results.
White balancing and colour matrices
Now, one of the bugbears of processing Raspberry Pi raw files up to this point has been the problem of getting sensible colours. Previously, the images have been rendered with a sickly green cast, simply because no colour balancing is being done and green is normally the most sensitive colour channel. In fact it’s even worse than this, as the RGB values in the raw image merely reflect the sensitivity of the sensor’s photo-sites to different wavelengths, and do not a priori have more than a general correlation with the colours as perceived by our own eyes. This is where we need white balancing and colour matrices.
Correct white balance multipliers are required if neutral parts of the scene are to look, well, neutral. We can use raspistill’s guesstimate of them, found in the JPEG+RAW file (or you can measure your own on a neutral part of the scene, like a grey card). Matrices and look-up tables are then required to convert colour from ‘camera’ space to the final colour space of choice, most commonly sRGB or Adobe RGB.
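The two steps can be sketched in a few lines of Python. The gains and matrix below are made-up illustration values, not the measured Raspberry Pi matrices; a real converter also applies tone curves and clipping, omitted here:

```python
# A sketch of the two colour steps a raw converter applies. The gains and
# the 3x3 matrix are hypothetical illustration values, not the measured
# Raspberry Pi camera matrices.

WB_GAINS = (1.8, 1.0, 1.6)          # per-channel R, G, B multipliers
CAM_TO_SRGB = [                     # camera-space -> sRGB matrix
    [ 1.6, -0.4, -0.2],
    [-0.3,  1.5, -0.2],
    [ 0.0, -0.6,  1.6],
]

def correct_pixel(rgb):
    # 1. White balance: scale each channel so neutrals come out neutral.
    balanced = [c * g for c, g in zip(rgb, WB_GAINS)]
    # 2. Colour matrix: map camera RGB into the sRGB colour space.
    return [sum(m * c for m, c in zip(row, balanced)) for row in CAM_TO_SRGB]

# A grey patch as the sensor might record it: green dominates because the
# green channel is the most sensitive.
neutral_in_camera = (0.5, 0.9, 0.55)
print([round(v, 2) for v in correct_pixel(neutral_in_camera)])
# After correction the three channels come out roughly equal.
```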
My thanks go to forum contributors Jack Hogan for measuring these colour matrices, and to Csaba Nagy for implementing them in the PyDNG tool. The results speak for themselves.
Previous attempts at raw conversion are on the left; the results using the updated PyDNG are on the right.
For those familiar with DNG files, we include links to DCP (DNG Camera Profile) files (warning: binary format). You can try different ones out in raw converters, and we would encourage users to experiment, to perhaps create their own, and to share their results!
A basic colour profile is baked into PyDNG, and is the one shown in the results above. It’s sufficiently small that we can view it as a JSON file.
Fly through the clouds in our re-creation of Konami’s classic 1980s shooter. Mark Vanstone has the code
Arguably one of Konami’s most successful titles, Time Pilot burst into arcades in 1982. Yoshiki Okamoto worked on it secretly, and it proved so successful that a sequel soon followed. In the original, the player flew through five eras: 1910, 1940, 1970, 1982, and then the far future, 2001. Aircraft start as biplanes and progress to become UFOs, naturally, by the last level.
Players also rescue other pilots by picking them up as they parachute from their aircraft. The player’s plane stays in the centre of the screen while other game objects move around it. The clouds that give the impression of movement have a parallax style to them, some moving faster than others, offering an illusion of depth.
To make our own version with Pygame Zero, we need eight frames of player aircraft images – one for each direction it can fly. After we create a player Actor object, we can get input from the cursor keys and change the direction the aircraft is pointing with a variable which will be set from zero to 7, zero being the up direction. Before we draw the player to the screen, we set the image of the Actor to the stem image name, plus whatever that direction variable is at the time. That will give us a rotating aircraft.
To provide a sense of movement, we add clouds. We can make a set of random clouds on the screen and move them in the opposite direction to the player aircraft. As we only have eight directions, we can use a lookup table to change the x and y coordinates rather than calculating movement values. When they go off the screen, we can make them reappear on the other side so that we end up with an ‘infinite’ playing area. Add a level variable to the clouds, and we can move them at different speeds on each update() call, producing the parallax effect. Then we need enemies. They will need the same eight frames to move in all directions. For this sample, we will just make one biplane, but more could be made and added.
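The lookup-table idea can be sketched like this (screen size and speed values are placeholders, not taken from the magazine's listing):

```python
# Sketch of the eight-direction lookup table: direction 0 is up, counting
# clockwise, and each direction maps straight to a (dx, dy) step, so no
# trigonometry is needed per frame. Clouds move opposite to the player.

MOVES = [(0, -1), (1, -1), (1, 0), (1, 1),
         (0, 1), (-1, 1), (-1, 0), (-1, -1)]

def move_cloud(cloud, player_dir, speed, width=800, height=600):
    """cloud: [x, y] list. Moves the cloud opposite to the player's
    direction, wrapping to the other side of the screen when it leaves."""
    dx, dy = MOVES[(player_dir + 4) % 8]   # +4 steps = opposite direction
    cloud[0] = (cloud[0] + dx * speed) % width
    cloud[1] = (cloud[1] + dy * speed) % height
    return cloud
```

Giving each cloud its own speed and calling this every update() produces the parallax effect described above.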
To get the enemy plane to fly towards the player, we need a little maths. We use the math.atan2() function to work out the angle between the enemy and the player. We convert that to a direction which we set in the enemy Actor object, and set its image and movement according to that direction variable. We should now have the enemy swooping around the player, but we will also need some bullets. When we create bullets, we need to put them in a list so that we can update each one individually in our update(). When the player hits the fire button, we just need to make a new bullet Actor and append it to the bullets list. We give it a direction (the same as the player Actor) and send it on its way, updating its position in the same way as we have done with the other game objects.
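The angle-to-direction step might look like the following (a sketch, not the magazine's exact listing; note that screen y grows downwards):

```python
import math

# Convert the enemy->player angle into one of the eight sprite directions
# (0 = up, counting clockwise), as described in the article.

def direction_to(enemy_x, enemy_y, player_x, player_y):
    # atan2 is called with (dx, -dy) so that angle 0 points up on screen
    # and positive angles turn clockwise, matching the sprite numbering.
    angle = math.atan2(player_x - enemy_x, enemy_y - player_y)
    # Snap to the nearest 45-degree step, wrapping into 0..7.
    return round(angle / (math.pi / 4)) % 8
```

Setting the enemy Actor's image and movement from this value makes it swoop towards the player in eight-direction steps.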
The last thing is to detect bullet hits. We do a quick point collision check and if there’s a match, we create an explosion Actor and respawn the enemy somewhere else. For this sample, we haven’t got any housekeeping code to remove old bullet Actors, which ought to be done if you don’t want the list to get really long, but that’s about all you need: you have yourself a Time Pilot clone!
Get your copy of Wireframe issue 41
You can read more features like this one in Wireframe issue 41, available directly from Raspberry Pi Press — we deliver worldwide.
When we announced new keyboards for Portugal and the Nordic countries last month, we promised that you wouldn’t have to wait much longer for a variant for Japan, and now it’s here!
Japanese Raspberry Pi keyboard
The Japan variant of the Raspberry Pi keyboard required a whole new moulding set to cover the 83-key arrangement of the keys. It’s quite a complex keyboard, with three different character sets to deal with. Figuring out how the USB keyboard controller maps to all the special keys on a Japanese keyboard was particularly challenging, with most web searches leading to non-English websites. Since I don’t read Japanese, it all became rather bewildering.
We ended up reverse-engineering generic Japanese keyboards to see how they work, and mapping the keycodes to key matrix locations. We are fortunate that we have a very patient keyboard IC vendor, called Holtek, which produces the custom firmware for the controller.
We then had to get these prototypes to our contacts in Japan, who told us which keys worked and which just produced a strange squiggle that they didn’t understand either. The “Yen” key was particularly difficult because many non-Japanese computers read it as a “/” character, no matter what we tried to make it work.
Special thanks are due to Kuan-Hsi Ho of Holtek, to Satoka Fujita for helping me test the prototypes, and to Matsumoto Seiya for also testing units and checking the translation of the packaging.
Get yours today
You can get the new Japanese keyboard variant in red/white from our Approved Reseller, SwitchScience, based in Japan.
If you’d rather your keyboard in black/grey, you can purchase it from Pimoroni and The Pi Hut in the UK, who both offer international shipping.
One of our favourite makers, Pi & Chips (AKA David Pride), wanted to see if they could trigger a DSLR camera to take pictures by using motion detection with OpenCV on Raspberry Pi.
You could certainly do this with a Raspberry Pi High Quality Camera, but David wanted to try with his swanky new Lumix camera. As well as a Raspberry Pi and whichever camera you’re using, you’ll also need a remote control. David sourced a cheap one from Amazon, since he knew full well he was going to be… breaking it a bit.
When it came to the “breaking” part, David explains: “I was hoping to be able to just re-solder some connectors to the button but it was a dual function button depending on depth of press. I therefore got a set of probes out and traced which pins on the chip were responsible for the actual shutter release and then *carefully* managed to add two fine wires.”
Next, David added Dupont cables to the ends of the wires to allow access to the breadboard, holding the cables in place with a blob of hot glue. Then a very simple circuit using an NPN transistor to switch via GPIO gave remote control of the camera from Python.
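The "very simple circuit" boils down to pulsing one GPIO pin: the pin drives the transistor's base, the transistor conducts, and the remote's shutter contacts are effectively pressed. A hedged sketch (pin number and pulse length are assumptions, not David's values; the pin object is injected so the same logic works with gpiozero on a real Pi):

```python
import time

# Pulse a GPIO pin to "press" the remote's shutter button via the NPN
# transistor. The pin object just needs on()/off() methods, so this works
# with gpiozero's output devices on real hardware.

def trigger_shutter(pin, pulse=0.2):
    """Hold the transistor's base high long enough to register a press."""
    pin.on()            # transistor conducts: shutter contacts closed
    time.sleep(pulse)   # keep the "button" held briefly
    pin.off()           # release the button

# On a real Raspberry Pi this could be, for example:
#   from gpiozero import LED
#   trigger_shutter(LED(17))   # pin 17 is an assumption
```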
David then added OpenCV to the mix, using this tutorial on PyImageSearch. He took the basic motion detection script and added a tiny hack to trigger the GPIO when motion was detected.
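The PyImageSearch script does its detection with OpenCV, but the underlying idea is plain frame differencing: compare each frame against a background frame and fire when enough pixels have changed. Sketched here with nested lists standing in for greyscale frames (both thresholds are illustrative):

```python
# Minimal frame-differencing motion detector: count the pixels that differ
# from the background by more than pixel_thresh, and report motion when
# that count reaches area_thresh. OpenCV does this (plus blurring and
# contour finding) on real camera frames.

def motion_detected(background, frame, pixel_thresh=25, area_thresh=4):
    """background, frame: 2D lists of greyscale values of equal shape."""
    changed = sum(
        1
        for bg_row, row in zip(background, frame)
        for bg, px in zip(bg_row, row)
        if abs(px - bg) > pixel_thresh
    )
    return changed >= area_thresh

# David's "tiny hack" then amounts to: if motion is detected, pulse the
# GPIO pin that fires the shutter.
```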
He needed to add a delay to the start of the script so he could position stuff, or himself, in front of the camera with time to spare. Got to think of those angles.
David concludes: “The camera was set to fully manual and to a really nice fast shutter speed. There is almost no delay at all between motion being detected and the Lumix actually taking pictures, I was really surprised how instantaneous it was.”
Here are some of the visuals captured by this Raspberry Pi-powered project…
Take a look at some more of David’s projects over at Pi & Chips.
One of our favourite YouTubers, Harrison McIntyre, decided to make the aphorism “a watched pot never boils” a reality. He modified a tabletop burner with a Raspberry Pi so that it will turn itself off if anyone looks at it.
In this project, the Raspberry Pi runs facial detection using a USB camera. If the Raspberry Pi finds a face, it deactivates the burner, and vice versa.
There’s a snag, in that the burner runs off 120 V AC and the Raspberry Pi runs off 5 V DC, so you can’t just power the burner through the Raspberry Pi. Harrison got round this problem using a relay switch, and beautifully explains how a relay manages to turn a circuit off and on without directly interfacing with the circuit at the two minute mark of this video.
Harrison sourced a switchable plug bar which uses a relay to turn its own switches on and off. Plug the burner and the Raspberry Pi into that and, hey presto, you’ve got them working together via a relay.
Things get jazzy at the four minute 30 second mark. At this point, Harrison decides to upgrade his single camera situation, and rig up six USB cameras to make sure that no matter where you are when you look at the burner, the Raspberry Pi will always see your face and switch it off.
Harrison’s multiple-camera setup proved a little much for the Raspberry Pi 3B he had to hand for this project, so he goes on to explain how he got a bit of extra processing power using a different desktop and an Arduino. He recommends going for a Raspberry Pi 4 if you want to try this at home.
Join us for Digital Making at Home: this week, young people can explore the graphics side of video game design! Through Digital Making at Home, we invite kids all over the world to code along with us and our new videos every week.
So get ready to design video game graphics with us:
Fancy tracking the ISS’s trajectory? All you need is a Raspberry Pi, an e-paper display, an enclosure, and a little Python code. Nicola King looks to the skies
Standing on his balcony one sunny evening, in perfect conditions, California-based astronomy enthusiast Sridhar Rajagopal spotted the International Space Station speeding by, and the seeds of an idea were duly sown. Having worked on several projects using tri-colour e-paper (aka e-ink) displays, which he likes for their “aesthetics and low-to-no-power consumption”, he thought that developing a way of tracking the ISS using such a display would be a perfect project to undertake.
“After a bit of searching, I was able to find an open API to get the ISS location at any given point in time,” explains Sridhar. “I also knew I wouldn’t have to worry about the data changing several times per second or even per minute. Even though the ISS is wicked fast (16 orbits in a day!), this would still be well within the refresh capabilities of the e-paper display.”
His ISS Tracker works by obtaining the ISS location from the Open Notify API every 30 seconds. It appends this data point to a list, so older data is available. “I don’t currently log the data to file, but it would be very easy to add this functionality,” says Sridhar. “Once I have appended the data to the list, I call the drawISS method of my Display class with the positions array, to render the world map and ISS trajectory and current location. The world map gets rendered to one PIL image, and the ISS location and trajectory get rendered to another PIL image.”
Each latitude/longitude position is mapped to the corresponding XY co-ordinate. The last position in the array (the latest position) gets rendered as the ISS icon to show its current position. “Every 30th data point gets rendered as a rectangle, and every other data point gets rendered as a tiny circle,” adds Sridhar.
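For an equirectangular world map, that latitude/longitude-to-XY mapping is a pair of linear scalings. A sketch (the 264×176 display size is an assumption based on a common 2.7″ tri-colour e-paper panel, not necessarily Sridhar's; check your own panel's resolution):

```python
# Map an ISS position (latitude, longitude in degrees) onto pixel
# coordinates of an equirectangular world map, (0, 0) at the top-left.
# The display size below is an assumed 2.7" e-paper resolution.

WIDTH, HEIGHT = 264, 176

def latlon_to_xy(lat, lon):
    """lat in [-90, 90], lon in [-180, 180] -> (x, y) pixel coordinates."""
    x = (lon + 180.0) / 360.0 * WIDTH    # longitude scales left-to-right
    y = (90.0 - lat) / 180.0 * HEIGHT    # latitude scales top-to-bottom
    return round(x), round(y)
```

The last position in the array is drawn with the ISS icon at these coordinates; older positions become the rectangles and tiny circles of the trajectory trail.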
From there, the images are then simply passed into the e-paper library’s display method; one image is rendered in black, and the other image in red.
Little wonder that the response received from friends, family, and the wider maker community has been extremely positive, as Sridhar shares: “The first feedback was from my non-techie wife who love-love-loved the idea of displaying the ISS location and trajectory on the e-paper display. She gave valuable input on the aesthetics of the data visualisation.”
In addition, he tells us that other makers have contributed suggestions for improvements. “JP, a Hackster community user […] added information to make the Python code a service and have it launch on bootup. I had him contribute his changes to my GitHub repository – I was thrilled about the community involvement!”
Housed in a versatile, transparent ProtoStax enclosure designed by Sridhar, the end result is an elegant way of showing the current position and trajectory of the ISS as it hurtles around the Earth at 7.6 km/s. Why not have a go at making your own display so you know when to look out for the space station whizzing across the night sky? It really is an awesome sight.
The team at Raspberry Pi and our partner ESA Education are pleased to announce the winning and highly commended Mission Space Lab teams of the 2019–20 European Astro Pi Challenge!
Mission Space Lab sees teams of young people across Europe design, create, and deploy experiments running on Astro Pi computers aboard the International Space Station. Their final task: analysing the experiments’ results and sending us scientific reports highlighting their methods, results, and conclusions.
The science the teams performed was truly impressive, and the reports they sent us were of outstanding quality. A special round of applause to the teams for making the effort to coordinate the writing of their reports while socially distanced!
The Astro Pi jury has now selected the ten winning teams, as well as eight highly commended teams:
And our winners are…
Vidhya’s code from the UK aimed to answer the question of how a compass works on the ISS, using the Astro Pi computer’s magnetometer and data from the World Magnetic Model (WMM).
Unknown from Externato Cooperativo da Benedita, Portugal, aptly investigated whether influenza is transmissible on a spacecraft such as the ISS, using the Astro Pi hardware alongside a deep literature review.
Space Wombats from Institut d’Altafulla, Spain, used normalized difference vegetation index (NDVI) analysis to identify burn scars from forest fires. They even managed to get results over Chernobyl!
Liberté from Catmose College, UK, set out to prove the Coriolis Effect by using Sobel filtering methods to identify the movement and direction of clouds.
Pardubice Pi from SPŠE a VOŠ Pardubice, Czech Republic, found areas of enormous vegetation loss by performing NDVI analysis on images taken from the Astro Pi and comparing this with historic images of the location.
Reforesting Entrepreneurs from Canterbury School of Gran Canaria, Spain, want to help solve the climate crisis by using NDVI analysis to identify locations where reforestation is possible.
1G5-Boys from Lycée Raynouard, France, innovatively conducted spectral analysis using Fast Fourier Transforms to study low-frequency vibrations of the ISS.
Cloud4 from Escola Secundária de Maria, Portugal, masterfully used a simplified static model and Fourier Analysis to detect atmospheric gravity waves (AGWs).
Cloud Wizzards from Primary School no. 48, Poland, scanned the sky to determine what percentage of the seas and oceans are covered by clouds.
Aguere Team 1 from IES Marina Cebrián, Spain, probed the behaviour of the magnetic field, acceleration, and temperature on the ISS by investigating disturbances, variations with latitude, and temporal changes.
Highly commended teams
Creative Coders, from the UK, decided to see how much of the Earth’s water is stored in clouds by analysing the pixels of each image of Earth their experiment collected.
Astro Jaslo from I Liceum Ogólnokształcące króla Stanisława Leszczyńskiego w Jaśle, Poland, used Riemannian geometry to determine the angle between light from the sun that is perpendicular to the Astro Pi camera, and the line segment from the ISS to Earth’s centre.
Jesto from S.M.S Arduino I.C.Ivrea1, Italy, used a multitude of the Astro Pi computers’ capabilities to study NDVI, magnetic fields, and aerosol mapping.
BLOOMERS from Tudor Vianu National Highschool of Computer Science, Romania, investigated how algae blooms are affected by eutrophication in polluted areas.
AstroLorenzini from Liceo Statale C. Lorenzini, Italy, used Kepler’s third law to determine the eccentricity, apogee, perigee, and mean tangential velocity of the ISS.
EasyPeasyCoding Verdala FutureAstronauts from Verdala International School & EasyPeasyCoding, Malta, utilised machine learning to differentiate between cloud types.
BHTeamEL from Branksome Hall, Canada, processed images using the Y of YCbCr colour mode data to investigate the relationship between cloud type and luminance.
Space Kludgers from Technology Club of Thrace, STETH, Greece, identified how atmospheric emissions correlate to population density, as well as using NDVI, ECCAD, and SEDAC to analyse the correlation of vegetation health and abundance with anthropogenic emissions.
The teams get a Q&A with astronaut Luca Parmitano
The prize for the winners and highly commended teams is the chance to pose their questions to ESA astronaut Luca Parmitano! The teams have been asked to record a question on video, which Luca will answer during a live stream on 3 September.
This Q&A event for the finalists will conclude this year’s European Astro Pi Challenge. Everyone on the Raspberry Pi and ESA Education teams congratulates this year’s participants on all their efforts.
It’s been a phenomenal year for the Astro Pi challenge: teams performed some great science, and across Mission Space Lab and Mission Zero, an astronomical 16,998 young people took part, from all ESA member states as well as Slovenia, Canada, and Malta.
Congratulations to everyone who took part!
Get excited for your next challenge!
This year’s European Astro Pi Challenge is almost over, and the next edition is just around the corner!
So we invite school teachers, educators, students, and all young people who love coding and space science to join us from September onwards.
Follow our updates on astro-pi.org and social media to make sure you don’t miss any announcements. We will see you for next year’s European Astro Pi Challenge!