Recreate Time Pilot’s free-scrolling action | Wireframe #41

Fly through the clouds in our re-creation of Konami’s classic 1980s shooter. Mark Vanstone has the code

Arguably one of Konami’s most successful titles, Time Pilot burst into arcades in 1982. Yoshiki Okamoto worked on it secretly, and it proved so successful that a sequel soon followed. In the original, the player flew through five eras: 1910, 1940, 1970, 1982, and then the far future, 2001. Aircraft start as biplanes and progress to become UFOs, naturally, by the last level.

Players also rescue other pilots by picking them up as they parachute from their aircraft. The player’s plane stays in the centre of the screen while other game objects move around it. The clouds that give the impression of movement have a parallax style to them, some moving faster than others, offering an illusion of depth.

To make our own version with Pygame Zero, we need eight frames of player aircraft images – one for each direction it can fly. After we create a player Actor object, we can get input from the cursor keys and change the direction the aircraft is pointing with a variable that runs from 0 to 7, with 0 being the up direction. Before we draw the player to the screen, we set the image of the Actor to the stem image name plus whatever that direction variable is at the time. That gives us a rotating aircraft.
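
Here’s a minimal Pygame Zero sketch of that idea. The image names (plane_0.png to plane_7.png in the images folder) and the key handling are illustrative assumptions rather than Mark’s exact code:

```python
# Run with: pgzrun timepilot.py
# Assumes eight frames named images/plane_0.png ... images/plane_7.png (0 = up, clockwise).
WIDTH = 800
HEIGHT = 600

player = Actor("plane_0", (WIDTH // 2, HEIGHT // 2))
direction = 0  # 0 to 7, with 0 pointing straight up

def update():
    global direction
    # Turn the aircraft by stepping the direction index, wrapping around at 8.
    # (A real game would add a short delay between turns.)
    if keyboard.left:
        direction = (direction - 1) % 8
    if keyboard.right:
        direction = (direction + 1) % 8

def draw():
    screen.fill((90, 170, 255))               # sky-blue background
    player.image = "plane_" + str(direction)  # stem name plus the direction variable
    player.draw()
```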

To provide a sense of movement, we add clouds. We can make a set of random clouds on the screen and move them in the opposite direction to the player aircraft. As we only have eight directions, we can use a lookup table to change the x and y coordinates rather than calculating movement values. When they go off the screen, we can make them reappear on the other side so that we end up with an ‘infinite’ playing area. Add a level variable to the clouds, and we can move them at different speeds on each update() call, producing the parallax effect. Then we need enemies. They will need the same eight frames to move in all directions. For this sample, we will just make one biplane, but more could be made and added.
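
Continuing the sketch above, the cloud logic could look like this: a lookup table of x/y offsets for the eight directions, and a per-cloud level that scales its speed for the parallax effect (again illustrative, including the assumed single cloud.png image):

```python
import random

# x/y movement for each of the eight directions (0 = up, going clockwise).
MOVES = [(0, -1), (1, -1), (1, 0), (1, 1), (0, 1), (-1, 1), (-1, 0), (-1, -1)]

clouds = []
for _ in range(12):
    cloud = Actor("cloud", (random.randint(0, WIDTH), random.randint(0, HEIGHT)))
    cloud.level = random.randint(1, 3)  # higher level = nearer = moves faster
    clouds.append(cloud)

def update_clouds():   # call this from update(), and draw each cloud in draw()
    dx, dy = MOVES[direction]
    for cloud in clouds:
        # Clouds drift opposite to the player's heading, scaled by their level.
        cloud.x -= dx * cloud.level
        cloud.y -= dy * cloud.level
        # Wrap around the edges to give an 'infinite' playing area.
        if cloud.right < 0:
            cloud.left = WIDTH
        elif cloud.left > WIDTH:
            cloud.right = 0
        if cloud.bottom < 0:
            cloud.top = HEIGHT
        elif cloud.top > HEIGHT:
            cloud.bottom = 0
```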

Our Python homage to Konami’s arcade classic.

To get the enemy plane to fly towards the player, we need a little maths. We use the math.atan2() function to work out the angle between the enemy and the player. We convert that to a direction which we set in the enemy Actor object, and set its image and movement according to that direction variable. We should now have the enemy swooping around the player, but we will also need some bullets. When we create bullets, we need to put them in a list so that we can update each one individually in our update(). When the player hits the fire button, we just need to make a new bullet Actor and append it to the bullets list. We give it a direction (the same as the player Actor) and send it on its way, updating its position in the same way as we have done with the other game objects.
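
A sketch of the enemy and bullet handling, building on the snippets above; the enemy_0 to enemy_7 and bullet image names are assumptions, and the exact angle-to-direction conversion may differ from Mark’s:

```python
import math

enemy = Actor("enemy_0", (100, 100))
bullets = []

def update_enemy():   # call from update()
    # Angle from the enemy to the player, measured from the positive x axis.
    angle = math.degrees(math.atan2(player.y - enemy.y, player.x - enemy.x))
    # Convert to one of the eight directions (0 = up, going clockwise).
    enemy.direction = int(round((angle + 90) / 45)) % 8
    enemy.image = "enemy_" + str(enemy.direction)
    dx, dy = MOVES[enemy.direction]
    enemy.x += dx * 2
    enemy.y += dy * 2

def on_key_down(key):
    if key == keys.SPACE:
        bullet = Actor("bullet", player.pos)
        bullet.direction = direction   # fly the way the player is facing
        bullets.append(bullet)

def update_bullets():   # call from update()
    for bullet in bullets:
        dx, dy = MOVES[bullet.direction]
        bullet.x += dx * 8
        bullet.y += dy * 8
```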

The last thing is to detect bullet hits. We do a quick point collision check and if there’s a match, we create an explosion Actor and respawn the enemy somewhere else. For this sample, we haven’t got any housekeeping code to remove old bullet Actors, which ought to be done if you don’t want the list to get really long, but that’s about all you need: you have yourself a Time Pilot clone!
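
The hit test can be a simple point check against the enemy Actor; here’s a sketch, with the explosion image name and respawn position as assumptions:

```python
explosions = []

def check_hits():   # call from update(), after the bullets have moved
    hit = None
    for bullet in bullets:
        # Point collision: is the bullet's position inside the enemy's rectangle?
        if enemy.collidepoint((bullet.x, bullet.y)):
            hit = bullet
            break
    if hit:
        bullets.remove(hit)
        explosions.append(Actor("explosion", enemy.pos))
        # Respawn the enemy just off one side of the screen.
        enemy.pos = (random.choice([-100, WIDTH + 100]), random.randint(0, HEIGHT))
```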

Here’s Mark’s code for a Time Pilot-style free-scrolling shooter. To get it running on your system, you’ll need to install Pygame Zero. And to download the full code and assets, head here.

Get your copy of Wireframe issue 41

You can read more features like this one in Wireframe issue 41, available directly from Raspberry Pi Press — we deliver worldwide.

And if you’d like a handy digital version of the magazine, you can also download issue 41 for free in PDF format.

Make sure to follow Wireframe on Twitter and Facebook for updates and exclusive offers and giveaways. Subscribe on the Wireframe website to save up to 49% compared to newsstand pricing!


Raspberry Pi keyboards for Japan are here!

When we announced new keyboards for Portugal and the Nordic countries last month, we promised that you wouldn’t have to wait much longer for a variant for Japan, and now it’s here!

Japanese Raspberry Pi keyboard

The Japan variant of the Raspberry Pi keyboard required a whole new moulding set to cover the 83-key arrangement of the keys. It’s quite a complex keyboard, with three different character sets to deal with. Figuring out how the USB keyboard controller maps to all the special keys on a Japanese keyboard was particularly challenging, with most web searches leading to non-English websites. Since I don’t read Japanese, it all became rather bewildering.

We ended up reverse-engineering generic Japanese keyboards to see how they work, and mapping the keycodes to key matrix locations. We are fortunate that we have a very patient keyboard IC vendor, called Holtek, which produces the custom firmware for the controller.

We then had to get these prototypes to our contacts in Japan, who told us which keys worked and which just produced a strange squiggle that they didn’t understand either. The “Yen” key was particularly difficult because, no matter what we tried to make it work, many non-Japanese computers read it as a “/” character.

Special thanks are due to Kuan-Hsi Ho of Holtek, to Satoka Fujita for helping me test the prototypes, and to Matsumoto Seiya for also testing units and checking the translation of the packaging.

Get yours today

You can get the new Japanese keyboard variant in red/white from our Approved Reseller, SwitchScience, based in Japan.

If you’d rather have your keyboard in black/grey, you can purchase it from Pimoroni and The Pi Hut in the UK, who both offer international shipping.


DSLR Motion Capture with Raspberry Pi and OpenCV

One of our favourite makers, Pi & Chips (AKA David Pride), wanted to see if they could trigger a DSLR camera to take pictures by using motion detection with OpenCV on Raspberry Pi.

You could certainly do this with a Raspberry Pi High Quality Camera, but David wanted to try with his swanky new Lumix camera. As well as a Raspberry Pi and whichever camera you’re using, you’ll also need a remote control. David sourced a cheap one from Amazon, since he knew full well he was going to be… breaking it a bit.

Breaking the remote a bit

When it came to the “breaking” part, David explains: “I was hoping to be able to just re-solder some connectors to the button but it was a dual function button depending on depth of press. I therefore got a set of probes out and traced which pins on the chip were responsible for the actual shutter release and then *carefully* managed to add two fine wires.”

Further breaking

Next, David added Dupont cables to the ends of the wires to allow access to the breadboard, holding the cables in place with a blob of hot glue. Then a very simple circuit, using an NPN transistor switched from a GPIO pin, gave him remote control of the camera from Python.
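
The Python side of that can be very small indeed. Here’s a minimal sketch using the gpiozero library, assuming the transistor’s base is driven (through a resistor) from GPIO 17; the pin number and pulse length are guesses rather than David’s actual values:

```python
from time import sleep
from gpiozero import DigitalOutputDevice

# GPIO 17 drives the NPN transistor's base via a resistor; when the pin goes
# high, the transistor closes the remote's shutter-release contacts.
shutter = DigitalOutputDevice(17)

def take_picture():
    shutter.on()
    sleep(0.2)    # hold the 'button press' long enough for the remote to register it
    shutter.off()

if __name__ == "__main__":
    take_picture()
```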

Raspberry Pi on the right, working together with the remote control’s innards on the left

David then added OpenCV to the mix, using this tutorial on PyImageSearch. He took the basic motion detection script and added a tiny hack to trigger the GPIO when motion was detected.

He needed to add a delay to the start of the script so he could position stuff, or himself, in front of the camera with time to spare. Got to think of those angles.
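
Put together, a stripped-down version of the idea might look like the sketch below: frame differencing in the style of the PyImageSearch tutorial, a start-up delay, and a GPIO pulse when something big enough moves. The pin number, thresholds, and contour area are assumptions (and OpenCV 4 is assumed); this is not David’s actual script:

```python
import time
import cv2
from gpiozero import DigitalOutputDevice

shutter = DigitalOutputDevice(17)      # same trigger circuit as above (assumed pin)

def take_picture():
    shutter.on()
    time.sleep(0.2)
    shutter.off()

time.sleep(10)                         # delay so you can get in front of the camera

cap = cv2.VideoCapture(0)
background = None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (21, 21), 0)
    if background is None:
        background = gray              # use the first frame as the static background
        continue
    # Difference against the background, threshold, and look for large contours.
    delta = cv2.absdiff(background, gray)
    thresh = cv2.dilate(cv2.threshold(delta, 25, 255, cv2.THRESH_BINARY)[1],
                        None, iterations=2)
    contours, _ = cv2.findContours(thresh, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if any(cv2.contourArea(c) > 5000 for c in contours):
        take_picture()
        time.sleep(2)                  # cool-down so one movement doesn't spam the shutter
```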

David concludes: “The camera was set to fully manual and to a really nice fast shutter speed. There is almost no delay at all between motion being detected and the Lumix actually taking pictures, I was really surprised how instantaneous it was.”

The whole setup mounted on a tripod ready to play

Here are some of the visuals captured by this Raspberry Pi-powered project…

Take a look at some more of David’s projects over at Pi & Chips.


Raspberry Pi won’t let your watched pot boil

One of our favourite YouTubers, Harrison McIntyre, decided to make the aphorism “a watched pot never boils” a reality. They modified a tabletop burner with a Raspberry Pi so that it will turn itself off if anyone looks at it.

In this project, the Raspberry Pi runs facial detection using a USB camera. If the Raspberry Pi finds a face, it deactivates the burner, and vice versa.

There’s a snag, in that the burner runs off 120 V AC and the Raspberry Pi runs off 5 V DC, so you can’t just power the burner through the Raspberry Pi. Harrison got round this problem using a relay, and at the two-minute mark of the video he beautifully explains how a relay manages to turn a circuit off and on without directly interfacing with it.

The Raspberry Pi working through the switchable plug with the burner

Harrison sourced a switchable plug bar which uses a relay to turn its own switches on and off. Plug the burner and the Raspberry Pi into that and, hey presto, you’ve got them working together via a relay.
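
The core loop might look something like this sketch, using OpenCV’s bundled Haar cascade for face detection and a GPIO pin to drive the relay in the plug bar. The pin number, camera index, and detection parameters are assumptions, not Harrison’s actual code:

```python
import cv2
from gpiozero import DigitalOutputDevice

relay = DigitalOutputDevice(17)   # pin wired to the plug bar's relay input (assumed)
relay.on()                        # burner powered until someone looks at it

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) > 0:
        relay.off()   # a face is watching: cut power to the burner
    else:
        relay.on()    # nobody watching: the pot may boil
```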

The six camera setup

Things get jazzy at the four-minute-30-second mark. At this point, Harrison decides to upgrade his single-camera setup and rig up six USB cameras, to make sure that no matter where you are when you look at the burner, the Raspberry Pi will always see your face and switch it off.

Inside the switchable plug

Harrison’s multiple-camera setup proved a little much for the Raspberry Pi 3B he had to hand for this project, so he goes on to explain how he got a bit of extra processing power using a different desktop and an Arduino. He recommends going for a Raspberry Pi 4 if you want to try this at home.

Kit list:

  • Raspberry Pi 4
  • Tabletop burner
  • USB cameras or rotating camera
  • Switchable plug bar
  • All of this software
It’s not just a saying anymore, thanks to Harrison

And the last great thing about this project is that you could invert the process to create a safety mechanism, meaning you wouldn’t be able to wander away from your cooking and leave things to burn.

We also endorse Harrison’s advice to try this with an electric burner and most definitely not a gas one; those things like to go boom if you don’t play with them properly.


App note: Tutorial selecting the optimum voltage reference

App note from Maxim Integrated to guide you in selecting voltage references. Link here

What could be more basic than a voltage reference – a simple, constant reference voltage? As with all design topics, there are tradeoffs. This article discusses the different types of voltage references, their key specifications, and the design tradeoffs, including accuracy, temperature-independence, current drive capability, power dissipation, stability, noise, and cost.

from Dangerous Prototypes https://ift.tt/2Dfq3ki

App note: Voltage Reference Application and Design Note

App note from Renesas on the basics of voltage references. Link here (PDF)

Conceptually, a voltage reference is a very simple device with only one purpose in its life. Quite simply, the purpose of a voltage reference is to generate an exact output voltage no matter what happens with respect to its operating voltage, load current, temperature changes or the passage of time.

from Dangerous Prototypes https://ift.tt/2ECZaHn

Design game graphics with Digital Making at Home

Join us for Digital Making at Home: this week, young people can explore the graphics side of video game design! Through Digital Making at Home, we invite kids all over the world to code along with us and our new videos every week.

So get ready to design video game graphics with us:

Check out this week’s code-along projects!

And tune in on Wednesday at 2pm BST / 9am EDT / 7.30pm IST at rpf.io/home to code along with our live stream session and make a Space Invaders–style shooter game in Scratch!


International Space Station Tracker | The MagPi 96

Fancy tracking the ISS’s trajectory? All you need is a Raspberry Pi, an e-paper display, an enclosure, and a little Python code. Nicola King looks to the skies

The e-paper display mid-refresh. It takes about three seconds to refresh, but it’s fast enough for this kind of project

Standing on his balcony one sunny evening, the perfect conditions enabled California-based astronomy enthusiast Sridhar Rajagopal to spot the International Space Station speeding by, and the seeds of an idea were duly sown. Having worked on several projects using tri-colour e-paper (aka e-ink) displays, which he likes for their “aesthetics and low-to-no-power consumption”, he thought that developing a way of tracking the ISS using such a display would be a perfect project to undertake.

“After a bit of searching, I was able to find an open API to get the ISS location at any given point in time,” explains Sridhar. “I also knew I wouldn’t have to worry about the data changing several times per second or even per minute. Even though the ISS is wicked fast (16 orbits in a day!), this would still be well within the refresh capabilities of the e-paper display.”

The ISS location data is obtained using the Open Notify API – visit magpi.cc/isslocation to see its current position

Station location

His ISS Tracker works by obtaining the ISS location from the Open Notify API every 30 seconds. It appends this data point to a list, so older data is available. “I don’t currently log the data to file, but it would be very easy to add this functionality,” says Sridhar. “Once I have appended the data to the list, I call the drawISS method of my Display class with the positions array, to render the world map and ISS trajectory and current location. The world map gets rendered to one PIL image, and the ISS location and trajectory get rendered to another PIL image.”
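
The fetching side can be as simple as the sketch below, which polls the public Open Notify endpoint with the requests library every 30 seconds and appends each position to a list. The drawISS call is shown only as a comment, because the Display class is Sridhar’s own code:

```python
import time
import requests

positions = []   # list of (lat, lon) tuples, oldest first

while True:
    data = requests.get("http://api.open-notify.org/iss-now.json", timeout=10).json()
    lat = float(data["iss_position"]["latitude"])
    lon = float(data["iss_position"]["longitude"])
    positions.append((lat, lon))
    # display.drawISS(positions)   # render world map, trajectory, and current position
    time.sleep(30)
```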

The project code is written in Python and can be found on Sridhar’s GitHub page: magpi.cc/isstrackercode

Each latitude/longitude position is mapped to the corresponding XY co-ordinate. The last position in the array (the latest position) gets rendered as the ISS icon to show its current position. “Every 30th data point gets rendered as a rectangle, and every other data point gets rendered as a tiny circle,” adds Sridhar.
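
For a rectangular world map, this mapping is a straightforward equirectangular projection. Here’s a small sketch; the display dimensions in the example are placeholders, not necessarily the tracker’s actual resolution:

```python
def latlon_to_xy(lat, lon, width, height):
    # Equirectangular projection: longitude maps linearly to x, latitude to y.
    x = int((lon + 180.0) / 360.0 * width)
    y = int((90.0 - lat) / 180.0 * height)
    return x, y

# Example: the point (0, 0) lands in the middle of a 264x176 buffer.
print(latlon_to_xy(0.0, 0.0, 264, 176))   # -> (132, 88)
```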

From there, the images are then simply passed into the e-paper library’s display method; one image is rendered in black, and the other image in red.

Track… star

Little wonder that the response received from friends, family, and the wider maker community has been extremely positive, as Sridhar shares: “The first feedback was from my non-techie wife who love-love-loved the idea of displaying the ISS location and trajectory on the e-paper display. She gave valuable input on the aesthetics of the data visualisation.”

Software engineer turned hardware-hacking enthusiast and entrepreneur, Sridhar Rajagopal is the founder of Upbeat Labs and creator of ProtoStax – a maker-friendly stackable, modular, and extensible enclosure system.

In addition, he tells us that other makers have contributed suggestions for improvements. “JP, a Hackster community user […] added information to make the Python code a service and have it launch on bootup. I had him contribute his changes to my GitHub repository – I was thrilled about the community involvement!”

Housed in a versatile, transparent ProtoStax enclosure designed by Sridhar, the end result is an elegant way of showing the current position and trajectory of the ISS as it hurtles around the Earth at 7.6 km/s. Why not have a go at making your own display so you know when to look out for the space station whizzing across the night sky? It really is an awesome sight.

Get The MagPi magazine issue 96 — out today

The MagPi magazine is out now, available in print from the Raspberry Pi Press online store, your local newsagents, and the Raspberry Pi Store, Cambridge.

You can also download the PDF directly from the MagPi magazine website.

Subscribe to the MagPi for 12 months to get a free Adafruit Circuit Playground, or choose from one of our other subscription offers, including this amazing limited-time offer of three issues and a book for only £10!


DIY OpenDPS power supply

Evan’s DIY OpenDPS power supply:

Years ago I heard about the OpenDPS project to give open source firmware to cheap and available Chinese power supplies. These aren’t strictly whole power supplies; they are configurable CC and CV buck converters. That means that it needs a stable DC source to back it to be used as a bench power supply. Perhaps you may not want to do this if you intend to use the DPS as a battery charger run from a solar supply or something, but most people I see want to use them for bench supplies, so that requires an existing DC supply. Today I finally finished mine.

from Dangerous Prototypes https://ift.tt/3gkdDpK

Amazing science from the winners of Astro Pi Mission Space Lab 2019–20

The team at Raspberry Pi and our partner ESA Education are pleased to announce the winning and highly commended Mission Space Lab teams of the 2019–20 European Astro Pi Challenge!

Astro Pi Mission Space Lab logo

Mission Space Lab sees teams of young people across Europe design, create, and deploy experiments running on Astro Pi computers aboard the International Space Station. Their final task: analysing the experiments’ results and sending us scientific reports highlighting their methods, results, and conclusions.

One of the Astro Pi computers aboard the International Space Station

The science the teams performed was truly impressive, and the reports they sent us were of outstanding quality. A special round of applause to the teams for coordinating the writing of their reports while socially distanced!

The Astro Pi jury has now selected the ten winning teams, as well as eight highly commended teams:

And our winners are…

Vidhya’s code from the UK aimed to answer the question of how a compass works on the ISS, using the Astro Pi computer’s magnetometer and data from the World Magnetic Model (WMM).

Unknown from Externato Cooperativo da Benedita, Portugal, aptly investigated whether influenza is transmissible on a spacecraft such as the ISS, using the Astro Pi hardware alongside a deep literature review.

Space Wombats from Institut d’Altafulla, Spain, used normalized difference vegetation index (NDVI) analysis to identify burn scars from forest fires. They even managed to get results over Chernobyl!

Liberté from Catmose College, UK, set out to prove the Coriolis Effect by using Sobel filtering methods to identify the movement and direction of clouds.

Pardubice Pi from SPŠE a VOŠ Pardubice, Czech Republic, found areas of enormous vegetation loss by performing NDVI analysis on images taken from the Astro Pi and comparing this with historic images of the location.

NDVI conversion image by the Pardubice Pi team

Reforesting Entrepreneurs from Canterbury School of Gran Canaria, Spain, want to help solve the climate crisis by using NDVI analysis to identify locations where reforestation is possible.

1G5-Boys from Lycée Raynouard, France, innovatively conducted spectral analysis using Fast Fourier Transforms to study low-frequency vibrations of the ISS.

Cloud4 from Escola Secundária de Maria, Portugal, masterfully used a simplified static model and Fourier Analysis to detect atmospheric gravity waves (AGWs).

Cloud Wizzards from Primary School no. 48, Poland, scanned the sky to determine what percentage of the seas and oceans are covered by clouds.

Aguere Team 1 from IES Marina Cebrián, Spain, probed the behaviour of the magnetic field, acceleration, and temperature on the ISS by investigating disturbances, variations with latitude, and temporal changes.

Highly commended teams

Creative Coders, from the UK, decided to see how much of the Earth’s water is stored in clouds by analysing the pixels of each image of Earth their experiment collected.

Astro Jaslo from I Liceum Ogólnokształcące króla Stanisława Leszczyńskiego w Jaśle, Poland, used Riemann geometry to determine the angle between light from the sun that is perpendicular to the Astro Pi camera, and the line segment from the ISS to Earth’s centre.

Jesto from S.M.S Arduino I.C.Ivrea1, Italy, used a multitude of the Astro Pi computers’ capabilities to study NDVI, magnetic fields, and aerosol mapping.

BLOOMERS from Tudor Vianu National Highschool of Computer Science, Romania, investigated how algae blooms are affected by eutrophication in polluted areas.

AstroLorenzini from Liceo Statale C. Lorenzini, Italy, used Kepler’s third law to determine the eccentricity, apogee, perigee, and mean tangential velocity of the ISS.

Photo of Italy, Calabria, and Sicilia (notice volcano Etna in the top right-hand corner) captured by the AstroLorenzini team

EasyPeasyCoding Verdala FutureAstronauts from Verdala International School & EasyPeasyCoding, Malta, utilised machine learning to differentiate between cloud types.

BHTeamEL from Branksome Hall, Canada, processed images using Y of YCbCr colour mode data to investigate the relationship between cloud type and luminescence.

Space Kludgers from Technology Club of Thrace, STETH, Greece, identified how atmospheric emissions correlate to population density, as well as using NDVI, ECCAD, and SEDAC to analyse the correlation of vegetation health and abundance with anthropogenic emissions.

The teams get a Q&A with astronaut Luca Parmitano

The prize for the winners and highly commended teams is the chance to pose their questions to ESA astronaut Luca Parmitano! The teams have been asked to record a question on video, which Luca will answer during a live stream on 3 September.

ESA astronaut Luca Parmitano aboard the International Space Station

This Q&A event for the finalists will conclude this year’s European Astro Pi Challenge. Everyone on the Raspberry Pi and ESA Education teams congratulates this year’s participants on all their efforts.

It’s been a phenomenal year for the Astro Pi Challenge: teams performed some great science, and across Mission Space Lab and Mission Zero, an astronomical 16,998 young people took part, from all ESA member states as well as Slovenia, Canada, and Malta.

Congratulations to everyone who took part!

Get excited for your next challenge!

This year’s European Astro Pi Challenge is almost over, and the next edition is just around the corner!

Compilation of photographs of Earth taken by Astro Pi Izzy aboard the ISS

So we invite school teachers, educators, students, and all young people who love coding and space science to join us from September onwards.

Follow our updates on astro-pi.org and social media to make sure you don’t miss any announcements. We will see you for next year’s European Astro Pi Challenge!
