Remote teams ring office bell with Raspberry Pi and Slack

Bustling offices… remember those? It feels like we’ve all been working from home forever, and it’s going to be a while yet before everyone is back at their desks in the same place. And when that does happen, if your workplace is anything like Raspberry Pi Towers, there will still be lots of people in your team who are based in different countries or have always worked from home.

This office bell, built by a person called Alex, is powered by a Raspberry Pi 3B+ and is linked to Slack, so when a milestone or achievement is announced on the chat platform by a remote team member, they get to experience ringing the office bell for themselves, no matter where in the world they are working from.

Kit list:

Close-up of the servo wired to the Raspberry Pi pins

Integrating with Slack

To get the Raspberry Pi talking to Slack, Alex used the slackclient module (Python 3.6+ only), which makes use of the Slack Real Time Messaging (RTM) API. This is a websocket-based API that allows you to receive events from Slack in real time and send messages as users.

With the Slack RTM API, you create an RTM client and register a callback function that the client executes every time a specific Slack event occurs. When staff tell the @pibot on Slack it’s ‘belltime’, the Raspberry Pi tells the servo to ring the bell in the office.

Alex also configured it to always respond with an emoji reaction when someone successfully rings the bell, so remote employees get some actual feedback that it worked. Here’s the script for that bit.
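The original script isn't reproduced here, but a minimal sketch of the pattern, assuming slackclient 2.x, a servo on GPIO 17 driven with gpiozero (the pin and token are placeholders), might look like this:

import time
from gpiozero import Servo
from slack import RTMClient  # slackclient 2.x, Python 3.6+

servo = Servo(17)  # assumed GPIO pin for the bell servo

def ring_bell():
    servo.max()       # swing the servo to strike the bell
    time.sleep(0.5)
    servo.min()       # return to rest

@RTMClient.run_on(event='message')
def on_message(**payload):
    data = payload['data']
    if 'belltime' in data.get('text', '').lower():
        ring_bell()
        # emoji reaction so the remote ringer knows it worked
        payload['web_client'].reactions_add(channel=data['channel'], name='bell', timestamp=data['ts'])

RTMClient(token='xoxb-your-bot-token').start()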

Alex also figured out how to get around Wi-Fi connectivity drops: they created a cron job that runs a bash script every 15 minutes to check whether the bell ringer is running; if it isn't, the script restarts it.
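A minimal version of that watchdog, assuming the bell script is saved as /home/pi/bellbot.py (the paths and names are placeholders), could be a crontab entry:

*/15 * * * * /home/pi/check_bell.sh

together with a check script like:

#!/bin/bash
# restart the bell ringer if it is not already running
if ! pgrep -f bellbot.py > /dev/null; then
    nohup python3 /home/pi/bellbot.py &
fi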

At the end of Alex’s original post, they’ve concluded that using a HAT would allow for more control of the servo and avoid frying the Raspberry Pi. They also cleaned up their set-up recently and switched the Raspberry Pi 3B+ out for a Raspberry Pi Zero, which is perfectly capable of this simple job.


Mini Raspberry Pi Boston Dynamics–inspired robot

This is a ‘Spot Micro’ walking quadruped robot running on Raspberry Pi 3B. By building this project, redditor u/thetrueonion (aka Mike) wanted to teach themselves robotic software development in C++ and Python, get the robot walking, and master velocity and directional control.

Mike was inspired by Spot, one of Boston Dynamics’ robots developed for industry to perform remote operation and autonomous sensing.

What’s it made of?

  • Raspberry Pi 3B
  • Servo control board: PCA9685, controlled via I2C
  • Servos: 12 × PDI-HV5523MG
  • LCD Panel: 16×2 I2C LCD panel
  • Battery: 2s 4000 mAh LiPo, direct connection to power servos
  • UBEC: HKU5 5V/5A, used as a 5V voltage regulator to power the Raspberry Pi, LCD panel, and PCA9685 control board
  • Thingiverse 3D-printed Spot Micro frame

How does it walk?

The mini ‘Spot Micro’ bot rocks a three-axis angle command/body pose control mode via keyboard and can achieve ‘trot gait’ or ‘walk gait’. The former is a four-phase gait with symmetric motion of two legs at a time (like a horse trotting). The latter is an eight-phase gait with one leg swinging at a time and a body shift in between for balance (like humans walking).

Mike breaks down how they got the robot walking, right down to the order the servos need to be connected to the PCA9685 control board, in this extensive walkthrough.
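Mike's own code handles the full kinematics, but as an illustration of the servo side, commanding a single joint through the PCA9685 over I2C with Adafruit's ServoKit library (the channel number and angles are assumptions) looks roughly like this:

from adafruit_servokit import ServoKit

kit = ServoKit(channels=16)  # the PCA9685 exposes 16 PWM channels over I2C

kit.servo[0].angle = 90  # assumed knee servo on channel 0: neutral stance
kit.servo[0].angle = 45  # bend the joint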

Here’s the code

And yes, this is one of those magical projects with all the code you need stored on GitHub. The software is implemented on a Raspberry Pi 3B running Ubuntu 16.04. It’s composed of C++ and Python nodes in a ROS framework.

What’s next?

Mike isn’t finished yet: they are looking to improve their yellow beast by incorporating a lidar to achieve simple 2D mapping of a room. Also on the list is developing an autonomous motion-planning module to guide the robot to execute a simple task around a sensed 2D environment. And finally, adding a camera or webcam to conduct basic image classification would finesse their creation.


Track your punches with Raspberry Pi

‘Track-o-punches’ tracks the number of punches thrown during workouts with Raspberry Pi and a RealSense camera, and it also displays your progress and sets challenges on a touchscreen.

In this video, Cisco shows you how to set up the RealSense camera and a Python virtual environment, and how to install dependencies and OpenCV for Python on your Raspberry Pi.

How it works

A RealSense robotic camera tracks the boxing glove as it enters and leaves the frame. Colour segmentation means the camera can more precisely pick up when Cisco’s white boxing glove is in frame. He walks you through how to threshold images for colour segmentation at this point in the video.
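The exact bounds are tuned in the video; the core OpenCV idea, with guessed HSV limits for a white glove, is a sketch like this:

import cv2
import numpy as np

LOWER = np.array([0, 0, 180])    # low saturation, high value: white-ish (guessed)
UPPER = np.array([180, 40, 255])

def glove_in_frame(frame):
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)
    mask = cv2.erode(mask, None, iterations=2)   # clean up speckle noise
    mask = cv2.dilate(mask, None, iterations=2)
    return cv2.countNonZero(mask) > 5000         # assumed "glove present" threshold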

Testing the tracking

All this data is then crunched on Raspberry Pi. Cisco’s code counts the consecutive frames that the segmented object is present; if that number is greater than a threshold, the code sees this as a particular action.
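In outline, and with an assumed frame threshold, that counting logic is:

MIN_FRAMES = 5  # consecutive frames before a detection counts as a punch (assumed)

consecutive = 0
punches = 0

def update(glove_present):
    # call once per frame with the colour-segmentation result
    global consecutive, punches
    if glove_present:
        consecutive += 1
        if consecutive == MIN_FRAMES:
            punches += 1  # register each appearance once
    else:
        consecutive = 0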

Raspberry Pi 4 being mounted on the Raspberry Pi 7″ Touch Display

Cisco used this data to set punch goals for the user. The Raspberry Pi is connected to an official Raspberry Pi 7″ Touch Display to show “success” and “fail” messages as well as the countdown clock. Once a goal is reached, the touchscreen tells the boxer that they’ve successfully hit their target; then the counter resets and a new goal is displayed. You can tweak the code to set a time limit for reaching a punch goal; Cisco found the countdown timer the hardest part to code.

Kit list

Jeeeez, it’s hard to get a screen grab of Cisco’s fists of fury

A mobile power source makes it easier to set up a Raspberry Pi wherever you want to work out. Cisco 3D-printed a mount for the RealSense camera and secured it on the ceiling so it could look down on him while he punched.


New twist on Raspberry Pi experimental resin 3D printer

Element14’s Clem previously built a giant Raspberry Pi-powered resin-based 3D printer, and here he’s flipped the concept upside down.

The new Raspberry Pi 4 8GB reduces slicing times and makes for a more responsive GUI on this experimental 3D printer. Let’s take a look at what Clem changed and how…

The previous iteration of his build was “huge”, mainly because the only suitable screen Clem had to hand was a big 4K monitor. This new build flips the previous concept upside down by reducing the base size and the amount of resin needed.

Breaking out of the axis

To resize the project effectively, Clem moved out of the X,Y plane and into Z, reducing the surface area while still allowing for scaling, well, upwards! The resized, flipped version of this project also reduces the cost (resin is expensive stuff) and makes the whole thing more portable than a traditional, clunky 3D printer.

Look how slim and portable it is!

How it works

Now for the brains of the thing: nanodlp is free (but not open source) software which Clem ran on a Raspberry Pi 4. Using an 8GB Raspberry Pi will get you faster slicing times, so go big if you can.

A switching power supply with 5V and 12V outputs drives the Nanotec stepper motor. To get the signal from the Raspberry Pi GPIO pins to the stepper driver and on to the motor, the pins are configured in nanodlp; Clem has shared his settings if you’d like to copy them (scroll down on this page to find a ‘Resources’ zip file just under the ‘Bill of Materials’ list).
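nanodlp generates the motion signals itself once the pins are set, but the underlying signal is just step and direction pulses; a hand-rolled equivalent with RPi.GPIO (pin numbers and timings are placeholders) would be:

import time
import RPi.GPIO as GPIO

STEP, DIR = 20, 21  # assumed BCM pins wired to the stepper driver

GPIO.setmode(GPIO.BCM)
GPIO.setup([STEP, DIR], GPIO.OUT)
GPIO.output(DIR, GPIO.HIGH)  # pick a direction, e.g. raise the build plate

for _ in range(200):  # 200 full steps is one revolution on many steppers
    GPIO.output(STEP, GPIO.HIGH)
    time.sleep(0.001)
    GPIO.output(STEP, GPIO.LOW)
    time.sleep(0.001)

GPIO.cleanup()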

Raspberry Pi working together with the display

For the display, there’s a Midas screen and an official Raspberry Pi 7″ Touchscreen Display, both of which work perfectly with nanodlp.

At 9:15 in the project video, Clem shows you around Fusion 360 and how he designed, printed, assembled, and tested the build’s engineering.

A bit of Fusion 360

Experimental resin

Now for the fancy, groundbreaking bit: Clem chose a very specialised Photocentric high-tensile daylight resin so he can use LEDs with a daylight spectrum. This type of resin also has a lower density, so the liquid does not need to be suspended by surface tension (as in traditional resin printers); instead, it floats because of its own buoyancy. This way, you need less resin to start with, and you waste less whenever you make a mistake. At 13:30 in the project video, Clem shares the secret of how to achieve an ‘Oversaturated Solution’ in order to get your resin to float.

Now for the science bit…

Materials

It’s not perfect but, if Clem’s happy, we’re happy.

Join the conversation on YouTube if you’ve got an idea that could improve this unique approach to building 3D printers.


Raspberry Pi calls out your custom workout routine

If you don’t want to be tied to a video screen during home workouts, Llum Acosta, Samreen Islam, and Alfred Gonzalez shared this great Raspberry Pi–powered alternative on hackster.io: their voice-activated project announces each move of your workout routine and how long you need to do it for.

This LED-lit, compact solution means you don’t need to squeeze yourself in front of a TV or crane your neck to see what your video instructor is doing next. Instead you can be out in the garden or at a local park and complete your own personalised workout on your own terms.

Kit list:

Raspberry Pi and MATRIX Device

The makers shared these setup guides to get MATRIX working with your Raspberry Pi. Our tiny computer doesn’t have a built-in microphone, so here’s where the two need to work together.

MATRIX, meet Raspberry Pi

Once that’s set up, ensure you enable SSH on your Raspberry Pi.

Click, click. Simple

The three sweet Hackster angels shared a four-step guide to running the software of your own customisable workout routine buddy in their original post. Happy hacking!

1. Install MATRIX Libraries and Rhasspy

Follow the steps below in order for Rhasspy to work on your Raspberry Pi.

2. Creating an intent

Access Rhasspy’s web interface by opening a browser and navigating to http://YOUR_PI_IP_HERE:12101. Then click on the Sentences tab. All intents and sentences are defined here.

By default, there are a few example sentences in the text box. Remove the default intents and add the following:

[Workout]
start [my] workout

Once created, click on Save Sentences and wait for Rhasspy to finish training.

Here, Workout is an intent. You can change the wording to anything that works for you as long as you keep [Workout] the same, because this intent name will be used in the code.

3. Catching the intent

Install git on your Raspberry Pi.

sudo apt install git

Download the repository.

git clone https://github.com/matrix-io/rhasspy-workout-timer

Navigate to the folder and install the project dependencies.

cd rhasspy-workout-timer
npm install

Run the program.

node index.js

4. Using and customizing the project

To change the workout to your desired routine, head into the project folder and open workout.txt. There, you’ll see:

jumping jacks 12,plank 15, test 14

To make your own workout routine, type an exercise name followed by the number of seconds to do it for. Repeat that for each exercise you want to do, separating each combo using a comma.
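For example, a hypothetical routine of your own might read:

pushups 30,squats 45,plank 60,rest 15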

Whenever you want to use the Rhasspy Assistant, run the file and say “Start my workout” (or whatever you’ve set it to).

And now you’re all done — happy working out. Make sure to visit the makers’ original post on hackster.io and give it a like.


Create a stop motion film with Digital Making at Home

Join us for Digital Making at Home: this week, young people can do stop motion and time-lapse animation with us! Through Digital Making at Home, we invite kids all over the world to code along with us and our new videos every week.

So get your Raspberry Pi and Camera Module ready! We’re using them to capture life with code this week:

Check out this week’s code-along projects!

And tune in on Wednesday at 2pm BST / 9am EDT / 7.30pm IST at rpf.io/home to code along with our live stream session and make a motion-detecting dance game in Scratch!


Processing raw image files from a Raspberry Pi High Quality Camera

When taking photos, most of us simply like to press the shutter button on our cameras and phones so that a viewable image is produced almost instantaneously, usually encoded in the well-known JPEG format. However, there are some applications where a little more control over the production of that JPEG is desirable. For instance, you may want more or less de-noising, or you may feel that the colours are not being rendered quite right.

This is where raw (sometimes RAW) files come in. A raw image in this context is a direct capture of the pixels output from the image sensor, with no additional processing. Normally this is in a relatively standard format known as a Bayer image, named after Bryce Bayer who pioneered the technique back in 1974 while working for Kodak. The idea is not to let the on-board hardware ISP (Image Signal Processor) turn the raw Bayer image into a viewable picture, but instead to do it offline with an additional piece of software, often referred to as a raw converter.

A Bayer image records only one colour at each pixel location, in the pattern shown

The raw image is sometimes likened to the old photographic negative, and whilst many camera vendors use their own proprietary formats, the most portable form of raw file is the Digital Negative (or DNG) format, defined by Adobe in 2004. The question at hand is how to obtain DNG files from Raspberry Pi, in such a way that we can process them using our favourite raw converters.

Obtaining a raw image from Raspberry Pi

Many readers will be familiar with the raspistill application, which captures JPEG images from the attached camera. raspistill includes the -r option, which appends all the raw image data to the end of the JPEG file. JPEG viewers will still display the file as normal but ignore the (many megabytes of) raw data tacked on the end. Such a “JPEG+RAW” file can be captured using the terminal command:

raspistill -r -o image.jpg

Unfortunately this JPEG+RAW format is merely what comes out of the camera stack and is not supported by any raw converters. So to make use of it we will have to convert it into a DNG file.

PyDNG

This Python utility converts the Raspberry Pi’s native JPEG+RAW files into DNGs. PyDNG can be installed from github.com/schoolpost/PyDNG, where more complete instructions are available. In brief, we need to perform the following steps:

git clone https://github.com/schoolpost/PyDNG
cd PyDNG
pip3 install src/.  # note that PyDNG requires Python3

PyDNG can be used as part of larger Python scripts, or it can be run stand-alone. Continuing the raspistill example from before, we can enter in a terminal window:

python3 examples/utility.py image.jpg
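You can also drive the same conversion from your own Python script; based on the interface shown in PyDNG’s README, it is roughly:

from pydng.core import RPICAM2DNG

d = RPICAM2DNG()
d.convert('image.jpg')  # writes image.dng next to the JPEG+RAW file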

The resulting DNG file can be processed by a variety of raw converters. Some are free (such as RawTherapee or dcraw, though the latter is no longer officially developed or supported), and there are many well-known proprietary options (Adobe Camera Raw or Lightroom, for instance). Perhaps users will post in the comments any that they feel have given them good results.

White balancing and colour matrices

Now, one of the bugbears of processing Raspberry Pi raw files up to this point has been the problem of getting sensible colours. Previously, the images have been rendered with a sickly green cast, simply because no colour balancing is being done and green is normally the most sensitive colour channel. In fact it’s even worse than this, as the RGB values in the raw image merely reflect the sensitivity of the sensor’s photo-sites to different wavelengths, and do not a priori have more than a general correlation with the colours as perceived by our own eyes. This is where we need white balancing and colour matrices.

Correct white balance multipliers are required if neutral parts of the scene are to look, well, neutral. We can use raspistill’s guesstimate of them, found in the JPEG+RAW file (or you can measure your own on a neutral part of the scene, like a grey card). Matrices and look-up tables are then required to convert colour from ‘camera’ space to the final colour space of choice, mostly sRGB or Adobe RGB.
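Conceptually, the two steps amount to a per-channel gain followed by a 3×3 matrix multiply; here is a toy numpy illustration (the gains and matrix values are invented, not a real calibration):

import numpy as np

wb = np.array([1.8, 1.0, 1.5])  # white-balance gains for R, G, B (illustrative)

ccm = np.array([[ 1.6, -0.4, -0.2],   # camera-to-sRGB matrix (illustrative)
                [-0.3,  1.5, -0.2],
                [ 0.1, -0.6,  1.5]])

def camera_to_srgb(rgb):
    # rgb: H x W x 3 array of linear camera-space values in [0, 1]
    balanced = rgb * wb          # neutral greys become grey
    out = balanced @ ccm.T       # rotate into sRGB primaries
    return np.clip(out, 0.0, 1.0)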

My thanks go to forum contributors Jack Hogan for measuring these colour matrices, and to Csaba Nagy for implementing them in the PyDNG tool. The results speak for themselves.

Results

Previous attempts at raw conversion are on the left; the results using the updated PyDNG are on the right.

DCP files

For those familiar with DNG files, we include links to DCP (DNG Camera Profile) files (warning: binary format). You can try different ones out in raw converters, and we would encourage users to experiment, to perhaps create their own, and to share their results!

  1. This is a basic colour profile baked into PyDNG, and is the one shown in the results above. It’s sufficiently small that we can view it as a JSON file.
  2. This is an improved (and larger) profile involving look-up tables, and aiming for an overall balanced colour rendition.
  3. This is similar to the previous one, but with some adjustments for skin tones and sky colours.

Note, however, that these files come with a few caveats. Specifically:

  • The calibration is only for a single Raspberry Pi High Quality Camera rather than a known average or “typical” module.
  • The illuminants used for the calibration are merely the ones that we had to hand — the D65 lamp in particular appears to be some way off.
  • The calibration only really works when the colour temperature lies between, or not too far from, the two calibration illuminants, approximately 2900K to 6000K in our case.

So there remains room for improvement. Nevertheless, results across a number of modules have shown these parameters to be a significant step forward.

Acknowledgements

My thanks again to Jack Hogan for performing the colour matrix calibration with DCamProf, and to Csaba Nagy for adding these new features to PyDNG.

Further reading

  1. There are many resources explaining how a raw (Bayer) image is converted into a viewable RGB or YUV image, among them Jack’s blog post.
  2. To understand the role of the colour matrices in a DNG file, please refer to the DNG specification. Chapter 6 in particular describes how they are used.


Recreate Time Pilot’s free-scrolling action | Wireframe #41

Fly through the clouds in our re-creation of Konami’s classic 1980s shooter. Mark Vanstone has the code

Arguably one of Konami’s most successful titles, Time Pilot burst into arcades in 1982. Yoshiki Okamoto worked on it secretly, and it proved so successful that a sequel soon followed. In the original, the player flew through five eras: 1910, 1940, 1970, 1982, and finally the far future of 2001. Aircraft start as biplanes and progress to become UFOs, naturally, by the last level.

Players also rescue other pilots by picking them up as they parachute from their aircraft. The player’s plane stays in the centre of the screen while other game objects move around it. The clouds that give the impression of movement have a parallax style to them, some moving faster than others, offering an illusion of depth.

To make our own version with Pygame Zero, we need eight frames of player aircraft images – one for each direction it can fly. After we create a player Actor object, we can get input from the cursor keys and change the direction the aircraft is pointing with a variable which will be set from zero to 7, zero being the up direction. Before we draw the player to the screen, we set the image of the Actor to the stem image name, plus whatever that direction variable is at the time. That will give us a rotating aircraft.
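A sketch of that input-and-image logic in Pygame Zero (the frame images are assumed to be named plane0.png to plane7.png):

player = Actor('plane0', center=(400, 300))
player.direction = 0  # 0 = up, counting clockwise through the 8 headings

def update():
    if keyboard.left:
        player.direction = (player.direction - 1) % 8
    if keyboard.right:
        player.direction = (player.direction + 1) % 8
    player.image = 'plane' + str(player.direction)  # stem name plus direction

def draw():
    screen.clear()
    player.draw()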

To provide a sense of movement, we add clouds. We can make a set of random clouds on the screen and move them in the opposite direction to the player aircraft. As we only have eight directions, we can use a lookup table to change the x and y coordinates rather than calculating movement values. When they go off the screen, we can make them reappear on the other side so that we end up with an ‘infinite’ playing area. Add a level variable to the clouds, and we can move them at different speeds on each update() call, producing the parallax effect. Then we need enemies. They will need the same eight frames to move in all directions. For this sample, we will just make one biplane, but more could be made and added.
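A minimal sketch of that lookup table and the wrap-around, assuming an 800×600 window and reusing the player Actor from above:

import random

# dx, dy for each of the eight headings, indexed by the direction variable
MOVES = [(0, -1), (1, -1), (1, 0), (1, 1), (0, 1), (-1, 1), (-1, 0), (-1, -1)]

clouds = []
for _ in range(10):
    cloud = Actor('cloud', pos=(random.randint(0, 800), random.randint(0, 600)))
    cloud.level = random.randint(1, 3)  # parallax depth: higher moves faster
    clouds.append(cloud)

def move_clouds():
    dx, dy = MOVES[player.direction]
    for cloud in clouds:
        # clouds drift opposite to the player's heading and wrap at the edges
        cloud.x = (cloud.x - dx * cloud.level) % 800
        cloud.y = (cloud.y - dy * cloud.level) % 600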

Our Python homage to Konami’s arcade classic.

To get the enemy plane to fly towards the player, we need a little maths. We use the math.atan2() function to work out the angle between the enemy and the player. We convert that to a direction which we set in the enemy Actor object, and set its image and movement according to that direction variable. We should now have the enemy swooping around the player, but we will also need some bullets. When we create bullets, we need to put them in a list so that we can update each one individually in our update(). When the player hits the fire button, we just need to make a new bullet Actor and append it to the bullets list. We give it a direction (the same as the player Actor) and send it on its way, updating its position in the same way as we have done with the other game objects.
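A hedged sketch of that homing behaviour, reusing the MOVES table and eight-frame naming from above (enemy frames assumed to be enemy0.png to enemy7.png):

import math

def steer_enemy(enemy):
    # angle from enemy to player, snapped to the nearest of the 8 headings
    angle = math.atan2(player.x - enemy.x, enemy.y - player.y)
    enemy.direction = round(math.degrees(angle) / 45) % 8
    enemy.image = 'enemy' + str(enemy.direction)
    dx, dy = MOVES[enemy.direction]
    enemy.x += dx * 2
    enemy.y += dy * 2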

The last thing is to detect bullet hits. We do a quick point collision check and if there’s a match, we create an explosion Actor and respawn the enemy somewhere else. For this sample, we haven’t got any housekeeping code to remove old bullet Actors, which ought to be done if you don’t want the list to get really long, but that’s about all you need: you have yourself a Time Pilot clone!

Here’s Mark’s code for a Time Pilot-style free-scrolling shooter. To get it running on your system, you’ll need to install Pygame Zero. And to download the full code and assets, head here.

Get your copy of Wireframe issue 41

You can read more features like this one in Wireframe issue 41, available directly from Raspberry Pi Press — we deliver worldwide.

And if you’d like a handy digital version of the magazine, you can also download issue 41 for free in PDF format.

Make sure to follow Wireframe on Twitter and Facebook for updates and exclusive offers and giveaways. Subscribe on the Wireframe website to save up to 49% compared to newsstand pricing!


Raspberry Pi keyboards for Japan are here!

When we announced new keyboards for Portugal and the Nordic countries last month, we promised that you wouldn’t have to wait much longer for a variant for Japan, and now it’s here!

Japanese Raspberry Pi keyboard

The Japan variant of the Raspberry Pi keyboard required a whole new moulding set to cover the 83-key arrangement of the keys. It’s quite a complex keyboard, with three different character sets to deal with. Figuring out how the USB keyboard controller maps to all the special keys on a Japanese keyboard was particularly challenging, with most web searches leading to non-English websites. Since I don’t read Japanese, it all became rather bewildering.

We ended up reverse-engineering generic Japanese keyboards to see how they work, and mapping the keycodes to key matrix locations. We are fortunate that we have a very patient keyboard IC vendor, called Holtek, which produces the custom firmware for the controller.

We then had to get these prototypes to our contacts in Japan, who told us which keys worked and which just produced a strange squiggle that they didn’t understand either. The “Yen” key was particularly difficult because many non-Japanese computers read it as a “\” character, no matter what we tried to make it work.

Special thanks are due to Kuan-Hsi Ho of Holtek, to Satoka Fujita for helping me test the prototypes, and to Matsumoto Seiya for also testing units and checking the translation of the packaging.

Get yours today

You can get the new Japanese keyboard variant in red/white from our Approved Reseller, SwitchScience, based in Japan.

If you’d rather your keyboard in black/grey, you can purchase it from Pimoroni and The Pi Hut in the UK, who both offer international shipping.


DSLR Motion Capture with Raspberry Pi and OpenCV

One of our favourite makers, Pi & Chips (AKA David Pride), wanted to see if they could trigger a DSLR camera to take pictures by using motion detection with OpenCV on Raspberry Pi.

You could certainly do this with a Raspberry Pi High Quality Camera, but David wanted to try with his swanky new Lumix camera. As well as a Raspberry Pi and whichever camera you’re using, you’ll also need a remote control. David sourced a cheap one from Amazon, since he knew full well he was going to be… breaking it a bit.

Breaking the remote a bit

When it came to the “breaking” part, David explains: “I was hoping to be able to just re-solder some connectors to the button but it was a dual function button depending on depth of press. I therefore got a set of probes out and traced which pins on the chip were responsible for the actual shutter release and then *carefully* managed to add two fine wires.”

Further breaking

Next, David added Dupont cables to the ends of the wires to allow access to the breadboard, holding the cables in place with a blob of hot glue. Then a very simple circuit, using an NPN transistor switched via GPIO, gave remote control of the camera from Python.
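A minimal sketch of that trigger, assuming the transistor’s base is wired to GPIO 18 (the pin and pulse length are guesses):

from time import sleep
from gpiozero import DigitalOutputDevice

shutter = DigitalOutputDevice(18)  # drives the NPN transistor's base

def take_picture():
    shutter.on()   # "presses" the remote's shutter button
    sleep(0.2)     # hold long enough for the remote to register
    shutter.off()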

Raspberry Pi on the right, working together with the remote control’s innards on the left

David then added OpenCV to the mix, using this tutorial on PyImageSearch. He took the basic motion detection script and added a tiny hack to trigger the GPIO when motion was detected.
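The PyImageSearch script diffs each frame against a reference background; a condensed sketch of the hack, with assumed thresholds and the take_picture() trigger from above, looks like this:

import cv2

cap = cv2.VideoCapture(0)
background = None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (21, 21), 0)
    if background is None:
        background = gray  # first frame becomes the reference
        continue
    delta = cv2.absdiff(background, gray)
    thresh = cv2.threshold(delta, 25, 255, cv2.THRESH_BINARY)[1]
    if cv2.countNonZero(thresh) > 5000:  # assumed motion threshold
        take_picture()  # fire the DSLR via GPIO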

He needed to add a delay to the start of the script so he could position stuff, or himself, in front of the camera with time to spare. Got to think of those angles.

David concludes: “The camera was set to fully manual and to a really nice fast shutter speed. There is almost no delay at all between motion being detected and the Lumix actually taking pictures, I was really surprised how instantaneous it was.”

The whole setup mounted on a tripod ready to play

Here are some of the visuals captured by this Raspberry Pi-powered project…

Take a look at some more of David’s projects over at Pi & Chips.
