Recently I was browsing through the USGS Earth Explorer website, looking for historic aerials for another project I’m working on. I came across a set of 1965 aerial photos of Funter Bay that I don’t think I’ve seen before.
The photos run in a series of four, going approximately North to South over the West side of the bay. They clearly show the Thlinket Packing Co. cannery (owned by Peter Pan Seafoods at the time).
These are pretty high-resolution photos; I’m only able to post snippets here since the full-size images are huge! Other areas of detail include several homes and cabins, like Harold and Mary Hargrave’s place:
You can also see the “neighborhood” where I grew up, Crab Cove. This was before most of the modern houses and cabins were built; some of the structures seen here were later torn down or incorporated into larger buildings. My family’s house would be near the top middle, between the creek and the railroad-like marine slipway.
These photos were taken on July 8th, 1965, during the fishing season. The time may have been later in the evening, as you can see several salmon trollers anchored or reversing to anchor:
You can also see various ruins and abandoned relics around the bay, such as the old Keeler Cabin (upper middle):
The wreck of the Nimrod:
And another wreck, likely one of the cannery tender vessels, next to some abandoned fish trap frames:
The original images are far too large for my website, at ~250MB each. You can find them on https://earthexplorer.usgs.gov/ with the following steps:
– Scroll to Funter Bay (or other area of interest). Make sure the area you want to search is centered in the website, as seen below, and click “Use Map”:
Next click “Data Sets”:
Then expand the “Aerial Imagery” section, select “Aerial Photo Single Frames”, and click “Results”:
You should now see a bunch of aerial photos of your selected area. For Funter Bay there are at least 6 pages to scroll through, and some of them are mismatched photos from different areas.
Since it’s a government site it’s not very user-friendly, but it does have a lot of neat content. They are apparently still adding photos and data as archives are scanned, so it’s worth checking back every few years to see if anything is new!
This fall I went up to Ontario, Canada, to pick up an 18ft diameter geodesic dome! (In the picture above, it’s the same size as the one on the far right).
This was at a former NATO satellite facility near the Diefenbunker, outside Ottawa. The big white dome on top of the building is still in use, so we just got a smaller one. You can see the full story below:
We’re planning to install this at Sandland for use as a radio observatory and general cool looking lawn ornament, probably in the upper field near the fast food playland that we picked up previously.
Huge thanks to Marcus Leech and the Canadian Center for Experimental Radio Astronomy (https://www.ccera.ca/) as well as CSS Building Inc (https://www.cssbuilding.ca/) for hooking us up with this! The CCERA is refurbishing the big satellite antenna at Carp into a radio telescope. You can see more about them here:
They’re all volunteers, and as with most science things, they can always use more funding! If you’d like to help CCERA with a donation, click here: https://gofund.me/29c87cc8

Parts of this video were filmed on location at the Diefenbunker: Canada’s Cold War museum. I’ll probably be doing another follow-up video on the bunker since it’s so big and interesting! You can find more about it here: https://diefenbunker.ca/

Thanks also to @SimpleElectronics and Tim Skinner (https://timpossible.photography/) for helping out with this project!
Stay tuned for more videos about this project, coming soon!
The Saveitforparts website has been around since 2001, when I registered the domain in college. Here’s what it looked like 22 years ago (insert joke about my website being old enough to drink):
The frames… the imagemaps… the 30px gifs… they burn!
The Saveitforparts Channel has been sort of a real(ish) thing since about 2020. I had some content on there before, but it wasn’t really set up as a regular “thing”. After some of the monorail videos got popular, I started doing project videos on YouTube. They were awkward, clumsy, and apparently popular enough to get me 1,000 subscribers pretty quickly.
The channel kind of poked along, getting not-that-many views on average. Occasionally there’d be a spike in viewership as the mysterious YouTube algorithm picked something up and promoted it.
Fast-forward to this spring, and the ever-mysterious algorithm decided to push some of my videos to the front page. Boom, suddenly I had 100k subscribers. I’m still not sure how it happened, and the traffic count died off just as fast as it usually does from one of these spikes. For those who think you rake in the cash at 100k… nope! I think you have to be up in the million subscriber range to quit your real job. What you DO get is every seller on Amazon asking you to review their products. Free samples are fun, but I don’t want to become just a review channel!
Anyway, YouTube sends you a wall plaque when you hit certain milestones, and the meme-y thing to do is mistreat them in some way. I took it easy on mine since I actually want it hanging on the wall! Instead of strapping it to a rocket or a submarine, I strapped it to a satellite dish! I was surprised to find that it had good SWR at the frequency range used by military SATCOM systems. Since these are mostly taken over by Brazilian truck drivers, it meant I could listen to CB traffic in Portuguese. Since I don’t speak Portuguese, this is just as entertaining and useless as it sounds.
Anyway, that’s all I have on this Youtube Thing at the moment. I’ll keep cross-spamming my interesting videos over here, and as always you can find them all here.
Schell’s Brewery in New Ulm, MN is one of the oldest (actually the 2nd oldest!) family-owned breweries in the US. They’re also the home of Grain Belt beer, a favorite drink and occasional sponsor of my shenanigans. Since most old breweries in Minnesota included sandstone caves, I’ve always been curious what’s under Schell’s campus. A couple weeks ago I got the opportunity to find out!
Sadly, there’s not as much left of the original cooling / lagering caves as I had hoped. Brewery expansions over the years have sealed off, filled in, and otherwise lost some of the cave system. However, the section that’s left is actually still used for making beer! That’s pretty unique among local breweries.
Thanks to Schell’s and their fine PR staff for getting us down there to check it out!
I’ve been shopping at Ax-Man Surplus stores for years, ever since I moved to the Twin Cities. They’re just the right mix of odd junk, industrial surplus, crafting, tools, materials, electronics, and funny signs and labels! I’ve often joked that anything I didn’t find in the trash came from Ax-Man.
Recently (ish), I was able to tour the store for my YouTube channel! This way, anyone who has the misfortune of living outside the Twin Cities can experience the wonders of Ax-Man for themselves!
When I built a DIY microwave imager earlier this year, I left some of my code unfinished. The high-resolution option seemed a little tricky at the time, since it used an unreliable and little-documented feature of the Dish Tailgater known as a “nudge”. This command, sent over USB serial connection or from a set-top box, would run the brushed motors in the antenna for just a second, pushing the antenna slightly closer (hopefully) to the best signal. Each azimuth nudge is approximately 0.2 of a compass degree, although as I found out later, this wasn’t the case for elevation.
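For anyone curious what driving the dish actually looks like, here’s a rough sketch of issuing a nudge over the USB console with pyserial. The device path, baud rate, and exact command syntax below are assumptions (check what your own unit’s firmware accepts); only the ~0.2-degree azimuth figure comes from my testing.

```python
# Sketch of nudging the Tailgater over its USB serial console.
# Command syntax, port, and baud rate are assumptions -- adjust for
# whatever your unit's firmware actually wants.

AZ_NUDGE_DEG = 0.2  # approximate azimuth travel per nudge (elevation differs!)

def build_nudge(axis, direction):
    """Build a hypothetical console command like 'nudge az cw'."""
    assert axis in ("az", "el") and direction in ("cw", "ccw")
    return f"nudge {axis} {direction}\r"

def send_nudge(port="/dev/ttyACM0", axis="az", direction="cw"):
    """Open the serial console and send one nudge (requires pyserial)."""
    import serial  # lazy import so the helpers above work without hardware
    with serial.Serial(port, 115200, timeout=1) as s:
        s.write(build_nudge(axis, direction).encode())
        return s.readline().decode(errors="ignore")  # firmware's echo/ack
```

In a real scan you’d call something like `send_nudge()` in a loop, pausing between nudges so the brushed motors have time to settle.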
Obviously, I wanted to improve my original low-resolution scan (seen above), which showed geostationary TV satellites in the Clarke Belt. In the above image, each pixel represents one degree of azimuth and one degree of elevation. Panning the dish back and forth through 180 degrees of Southern sky took a whopping 3+ hours to complete. Due to a quirk of the antenna programming, that was the absolute fastest I could make it run, and even that took some fancy handling of the signal data being returned by the serial terminal.
Below is a close-up of the inset box seen above. This is still using the low-res code, where each colored square is one degree wide and tall:
Originally I had the dish scanning back and forth in alternating directions. This was (slightly) faster, and looked cooler, than having it return to the origin azimuth for each elevation. However, I had ongoing issues with gear meshing (switching from clockwise to counterclockwise had some slack or play in the motor). I also had issues with my indexing that never quite went away no matter how I massaged the Python data array or bitmap. Making things worse, the “nudge” motor runs aren’t consistent in each direction: clockwise nudges are a different amount of antenna travel than counterclockwise ones, so the image slowly drifted off at an angle. Commenters on YouTube and GitHub kept suggesting I ditch the alternating scan and just go in one direction each time. At the expense of my cool-looking dish motion, I finally gave in and did that. The result is that high resolution now works!
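The unidirectional pattern boils down to something like the sketch below: every row starts from the same azimuth and only ever steps clockwise, so gear backlash never flips sign mid-image. The function names here are made up for illustration, and `read_signal` stands in for the real serial query.

```python
# Sketch of the unidirectional raster that fixed my drift problems.
# Instead of alternating sweeps, every row scans clockwise from the same
# start azimuth, so backlash only hits the (untimed) return slew.

def raster_scan(az_steps, el_steps, read_signal):
    """Return a row-major grid of readings, one row per elevation."""
    image = []
    for el in range(el_steps):
        row = []
        # (slew back to the origin azimuth here, then nudge clockwise)
        for az in range(az_steps):
            row.append(read_signal(az, el))
        image.append(row)
    return image
```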
I wasn’t certain this would work at all, even after getting the motor movements to cooperate. For one thing, the beamwidth of this little 14″ dish is more than a degree, so I was worried that 0.2-degree movements would just give me a mess of noise and artifacts. It turns out that the smaller movements do get a better picture, although you can still see some fuzz and reflections around each satellite transponder source.
Another issue is that (as mentioned before), this Tailgater satellite dish isn’t designed to do any of this. I’m running the motors nearly constantly for hours at a time, when the typical TV-watcher-on-the-go would only run them for a few minutes and then leave it alone until they moved their RV / campsite / fish house. I noticed the more I ran the dish, the more horizontal bands and artifacts showed up in the high res scan. I also started hearing squeaking and grinding noises from the antenna as the poor overworked motor struggled to act as a radio telescope. I took the antenna apart and shot some silicone spray into the worst areas, but eventually I’ll probably ruin the thing!
Keep in mind that the high-res code is 5x more detailed in the x direction and 3x more in the y direction, so it will run 15x slower than the low-res version!
There are also some pull requests from people with suggestions to improve my code, which I have been shamefully ignoring since I don’t understand them and haven’t had time to test them out. If you’re better at Python coding than I am, feel free to poke around and make this better!
Who knew “portable” satellite dishes were a thing? Sure there are some 90’s and 2000s versions like the Dish Tailgater models I’ve been experimenting with, but did you know there were fold-up C-band dishes from the 80s?
I’ve also opened up a few of Tailgater’s competition, the Winegard brand dish. These seem to have a little different construction, using stepper motors instead of brushed motors.
I’ll no doubt have some future projects with some of these, so stay tuned!
I’ve dabbled in radio telescopes before, mostly as a way to use old TV satellite dishes. However, this time I took a satellite dish and turned it into a microwave “camera”, able to create images in the Ku band!
The dish I’m using is a “Tailgater” model, which is another gadget I’ve experimented with before. The particular model I’m using has a USB console port, allowing serial commands to be sent from a Linux or Windows PC. I was able to automate the motor and receiver commands, driving the dish through a set of azimuth and elevation positions while recording the signal strength.
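The messy part is that the console output mixes status chatter with the actual readings, so the script has to fish numbers out of whatever comes back. Here’s a simplified sketch of that filtering; the response format shown is an assumption, not the dish’s exact output.

```python
# Sketch of pulling a signal-strength number out of console output.
# My unit interleaves status chatter with readings, so I keep only the
# lines that actually contain a parseable number.
import re

def parse_strength(line):
    """Return the first number found in a console line, or None."""
    m = re.search(r"-?\d+(?:\.\d+)?", line)
    return float(m.group()) if m else None

def latest_strength(lines):
    """Scan a chunk of console output for the most recent reading."""
    readings = [v for v in map(parse_strength, lines) if v is not None]
    return readings[-1] if readings else None
```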
If you’re planning to do this yourself, these dishes can often be found on Craigslist / Facebook for $0-$50. I think I’ve paid an average of $20 each for four of the things so far. The hard part is finding exactly the right one, as there are various models, revisions, and brand names (VuQube, King, and Winegard are some). Some only have RJ11 control ports, which I haven’t experimented with. The one I’m currently using is from 2014, has a square-ish case, and you’ll have to unscrew the top and see if it has a USB “A” port. I’ve also encountered one with a Mini-USB port, but couldn’t get that one to work. Update: I dried out the damp Mini-USB version and got a serial console over a USB cable. The motors are still seized up, but it seems to be much the same as the USB “A” version. The firmware is from 2011 and doesn’t have an “elangle” command, but changing that to “elev” and changing the numbering range in the Python code should theoretically work.
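If you end up with a mix of firmware vintages, one way to handle it (untested on the 2011 unit, so treat this as a guess) is a little wrapper that picks the elevation keyword for you. The “elangle”/“elev” names come from my units; the wrapper itself is hypothetical.

```python
# Hypothetical compatibility shim: newer Tailgater firmware takes
# "elangle", while my 2011-firmware unit only knows "elev".

def elevation_command(firmware_year, degrees):
    """Build an elevation command for the unit's firmware vintage."""
    keyword = "elev" if firmware_year <= 2011 else "elangle"
    return f"{keyword} {degrees}\r"
```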
This method is great for imaging a room or building, but where it really shines is for detecting and mapping satellites in geosynchronous orbit. These are, after all, the transmitters this dish is designed for. By panning the dish across the Clarke Belt, I can generate a Ku band image of the radio beacons on these satellites.
These images can be overlaid on panoramic photos to show relative satellite locations. This is a great way to troubleshoot a dish installation (for TV, hobbyist, or other use). You can instantly see which satellites are blocked by trees, or which are drowned out by their outer space neighbors. For example, poor little Galaxy 13 in the 127W orbital slot is barely visible as a dim smudge between the high-power beacons of Galaxy 12 and DirecTV 8. No wonder I had so much trouble picking up PBS on my other dish!
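The overlay itself is just alpha blending. I line the images up by hand, but the pixel math looks roughly like this toy sketch (plain nested lists stand in for real image arrays, values 0-255; a real workflow would use an image library or editor):

```python
# Toy sketch of alpha-blending a scan "heatmap" onto a photo.
# Both inputs are rows of 0-255 pixel values of the same shape.

def overlay(photo, heatmap, alpha=0.5):
    """Blend heatmap onto photo; alpha is the heatmap's opacity."""
    def blend(p, h):
        return min(255, max(0, round((1 - alpha) * p + alpha * h)))
    return [[blend(p, h) for p, h in zip(prow, hrow)]
            for prow, hrow in zip(photo, heatmap)]
```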
I’m not the first to image satellites like this. Professor James Aguirre of the University of Pennsylvania has a Mini Radio Telescope project on GitHub. The Thought Emporium on YouTube has a similar project. However, both of these approaches require more custom hardware and software for antenna aiming. My method just needs an old $20 Tailgater antenna from Craigslist!
This system can also be used to track down RF leaks. Here’s an indoor scan of my office, overlaid on a panoramic photo of the room. A microwave leak can immediately be seen coming from my poorly-shielded computer tower in the lower right.
If you happen to have one of these Tailgater dishes and want to play around with microwave imaging, check out my Python code on GitHub. If you’re better at Python than I am, you can probably improve things a bit for your own use 🙂
My take on this project uses a knock-off foam dart blaster. Since the original code is from 2015 and uses older versions of Python and OpenCV, I had to modify a few things. More details below.
I made the following modifications:
– Installed Python 3 instead of Python 2.
– Installed OpenCV 4 instead of 3.
– Skipped the whole virtual environment thing; I couldn’t get it working and it didn’t seem necessary.
I edited turret.py as follows:
– Changed the print statements in turret.py to use Python 3 style parentheses
– Changed “import thread” to “import _thread” in line 7
– Changed “thread” to “_thread” in line 423
– Removed “lm,” from line 158
– Changed print syntax on lines 283 and 284 to have the closing parentheses at the end
My modified version of turret.py can be found below:
### Original turret.py file by HackerShack
### from https://github.com/HackerShackOfficial/Tracking-Turret
### Modified 2/16/2022 by Gabe Emerson to work with Python 3
try:
    import cv2
except Exception as e:
    print("Warning: OpenCV not installed. To use motion detection, make sure you've properly configured OpenCV.")

import time
import _thread
import threading
import atexit
import sys
import termios
import contextlib
import imutils
import RPi.GPIO as GPIO

from Adafruit_MotorHAT import Adafruit_MotorHAT, Adafruit_DCMotor, Adafruit_StepperMotor

### User Parameters ###
MOTOR_X_REVERSED = False
MOTOR_Y_REVERSED = False
MAX_STEPS_X = 30
MAX_STEPS_Y = 15
RELAY_PIN = 22
#######################


@contextlib.contextmanager
def raw_mode(file):
    """
    Magic function that allows key presses.
    :param file:
    :return:
    """
    old_attrs = termios.tcgetattr(file.fileno())
    new_attrs = old_attrs[:]
    new_attrs[3] = new_attrs[3] & ~(termios.ECHO | termios.ICANON)
    try:
        termios.tcsetattr(file.fileno(), termios.TCSADRAIN, new_attrs)
        yield
    finally:
        termios.tcsetattr(file.fileno(), termios.TCSADRAIN, old_attrs)


class VideoUtils(object):
    """
    Helper functions for video utilities.
    """
    @staticmethod
    def live_video(camera_port=0):
        """
        Opens a window with live video.
        :param camera:
        :return:
        """
        video_capture = cv2.VideoCapture(camera_port)

        while True:
            # Capture frame-by-frame
            ret, frame = video_capture.read()

            # Display the resulting frame
            cv2.imshow('Video', frame)

            if cv2.waitKey(1) & 0xFF == ord('q'):
                break

        # When everything is done, release the capture
        video_capture.release()
        cv2.destroyAllWindows()

    @staticmethod
    def find_motion(callback, camera_port=0, show_video=False):
        camera = cv2.VideoCapture(camera_port)
        time.sleep(0.25)

        # initialize the first frame in the video stream
        firstFrame = None
        tempFrame = None
        count = 0

        # loop over the frames of the video
        while True:
            # grab the current frame and initialize the occupied/unoccupied
            # text
            (grabbed, frame) = camera.read()

            # if the frame could not be grabbed, then we have reached the end
            # of the video
            if not grabbed:
                break

            # resize the frame, convert it to grayscale, and blur it
            frame = imutils.resize(frame, width=500)
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            gray = cv2.GaussianBlur(gray, (21, 21), 0)

            # if the first frame is None, initialize it
            if firstFrame is None:
                print("Waiting for video to adjust...")
                if tempFrame is None:
                    tempFrame = gray
                    continue
                else:
                    delta = cv2.absdiff(tempFrame, gray)
                    tempFrame = gray
                    tst = cv2.threshold(delta, 5, 255, cv2.THRESH_BINARY)[1]
                    tst = cv2.dilate(tst, None, iterations=2)
                    if count > 30:
                        print("Done.\n Waiting for motion.")
                        if not cv2.countNonZero(tst) > 0:
                            firstFrame = gray
                        else:
                            continue
                    else:
                        count += 1
                        continue

            # compute the absolute difference between the current frame and
            # first frame
            frameDelta = cv2.absdiff(firstFrame, gray)
            thresh = cv2.threshold(frameDelta, 25, 255, cv2.THRESH_BINARY)[1]

            # dilate the thresholded image to fill in holes, then find contours
            # on thresholded image
            thresh = cv2.dilate(thresh, None, iterations=2)
            c = VideoUtils.get_best_contour(thresh.copy(), 5000)

            if c is not None:
                # compute the bounding box for the contour, draw it on the frame,
                # and update the text
                (x, y, w, h) = cv2.boundingRect(c)
                cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
                callback(c, frame)

            # show the frame and record if the user presses a key
            if show_video:
                cv2.imshow("Security Feed", frame)
                key = cv2.waitKey(1) & 0xFF

                # if the `q` key is pressed, break from the loop
                if key == ord("q"):
                    break

        # cleanup the camera and close any open windows
        camera.release()
        cv2.destroyAllWindows()

    @staticmethod
    def get_best_contour(imgmask, threshold):
        contours, hierarchy = cv2.findContours(imgmask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        best_area = threshold
        best_cnt = None
        for cnt in contours:
            area = cv2.contourArea(cnt)
            if area > best_area:
                best_area = area
                best_cnt = cnt
        return best_cnt


class Turret(object):
    """
    Class used for turret control.
    """
    def __init__(self, friendly_mode=True):
        self.friendly_mode = friendly_mode

        # create a default object, no changes to I2C address or frequency
        self.mh = Adafruit_MotorHAT()
        atexit.register(self.__turn_off_motors)

        # Stepper motor 1
        self.sm_x = self.mh.getStepper(200, 1)  # 200 steps/rev, motor port #1
        self.sm_x.setSpeed(5)                   # 5 RPM
        self.current_x_steps = 0

        # Stepper motor 2
        self.sm_y = self.mh.getStepper(200, 2)  # 200 steps/rev, motor port #2
        self.sm_y.setSpeed(5)                   # 5 RPM
        self.current_y_steps = 0

        # Relay
        GPIO.setmode(GPIO.BCM)
        GPIO.setup(RELAY_PIN, GPIO.OUT)
        GPIO.output(RELAY_PIN, GPIO.LOW)

    def calibrate(self):
        """
        Waits for input to calibrate the turret's axis.
        :return:
        """
        print("Please calibrate the tilt of the gun so that it is level. Commands: (w) moves up, "
              "(s) moves down. Press (enter) to finish.\n")
        self.__calibrate_y_axis()

        print("Please calibrate the yaw of the gun so that it aligns with the camera. Commands: (a) moves left, "
              "(d) moves right. Press (enter) to finish.\n")
        self.__calibrate_x_axis()

        print("Calibration finished.")

    def __calibrate_x_axis(self):
        """
        Waits for input to calibrate the x axis.
        :return:
        """
        with raw_mode(sys.stdin):
            try:
                while True:
                    ch = sys.stdin.read(1)
                    if not ch:
                        break
                    elif ch == "a":
                        if MOTOR_X_REVERSED:
                            Turret.move_backward(self.sm_x, 5)
                        else:
                            Turret.move_forward(self.sm_x, 5)
                    elif ch == "d":
                        if MOTOR_X_REVERSED:
                            Turret.move_forward(self.sm_x, 5)
                        else:
                            Turret.move_backward(self.sm_x, 5)
                    elif ch == "\n":
                        break
            except (KeyboardInterrupt, EOFError):
                print("Error: Unable to calibrate turret. Exiting...")
                sys.exit(1)

    def __calibrate_y_axis(self):
        """
        Waits for input to calibrate the y axis.
        :return:
        """
        with raw_mode(sys.stdin):
            try:
                while True:
                    ch = sys.stdin.read(1)
                    if not ch:
                        break
                    if ch == "w":
                        if MOTOR_Y_REVERSED:
                            Turret.move_forward(self.sm_y, 5)
                        else:
                            Turret.move_backward(self.sm_y, 5)
                    elif ch == "s":
                        if MOTOR_Y_REVERSED:
                            Turret.move_backward(self.sm_y, 5)
                        else:
                            Turret.move_forward(self.sm_y, 5)
                    elif ch == "\n":
                        break
            except (KeyboardInterrupt, EOFError):
                print("Error: Unable to calibrate turret. Exiting...")
                sys.exit(1)

    def motion_detection(self, show_video=False):
        """
        Uses the camera to move the turret. OpenCV must be configured to use this.
        :return:
        """
        VideoUtils.find_motion(self.__move_axis, show_video=show_video)

    def __move_axis(self, contour, frame):
        (v_h, v_w) = frame.shape[:2]
        (x, y, w, h) = cv2.boundingRect(contour)

        # find height
        target_steps_x = (2 * MAX_STEPS_X * (x + w / 2) / v_w) - MAX_STEPS_X
        target_steps_y = (2 * MAX_STEPS_Y * (y + h / 2) / v_h) - MAX_STEPS_Y

        print("x: %s, y: %s" % (str(target_steps_x), str(target_steps_y)))
        print("current x: %s, current y: %s" % (str(self.current_x_steps), str(self.current_y_steps)))

        t_x = threading.Thread()
        t_y = threading.Thread()
        t_fire = threading.Thread()

        # move x
        if (target_steps_x - self.current_x_steps) > 0:
            self.current_x_steps += 1
            if MOTOR_X_REVERSED:
                t_x = threading.Thread(target=Turret.move_forward, args=(self.sm_x, 2,))
            else:
                t_x = threading.Thread(target=Turret.move_backward, args=(self.sm_x, 2,))
        elif (target_steps_x - self.current_x_steps) < 0:
            self.current_x_steps -= 1
            if MOTOR_X_REVERSED:
                t_x = threading.Thread(target=Turret.move_backward, args=(self.sm_x, 2,))
            else:
                t_x = threading.Thread(target=Turret.move_forward, args=(self.sm_x, 2,))

        # move y
        if (target_steps_y - self.current_y_steps) > 0:
            self.current_y_steps += 1
            if MOTOR_Y_REVERSED:
                t_y = threading.Thread(target=Turret.move_backward, args=(self.sm_y, 2,))
            else:
                t_y = threading.Thread(target=Turret.move_forward, args=(self.sm_y, 2,))
        elif (target_steps_y - self.current_y_steps) < 0:
            self.current_y_steps -= 1
            if MOTOR_Y_REVERSED:
                t_y = threading.Thread(target=Turret.move_forward, args=(self.sm_y, 2,))
            else:
                t_y = threading.Thread(target=Turret.move_backward, args=(self.sm_y, 2,))

        # fire if necessary
        if not self.friendly_mode:
            if abs(target_steps_y - self.current_y_steps) <= 2 and abs(target_steps_x - self.current_x_steps) <= 2:
                t_fire = threading.Thread(target=Turret.fire)

        t_x.start()
        t_y.start()
        t_fire.start()

        t_x.join()
        t_y.join()
        t_fire.join()

    def interactive(self):
        """
        Starts an interactive session. Key presses determine movement.
        :return:
        """
        Turret.move_forward(self.sm_x, 1)
        Turret.move_forward(self.sm_y, 1)

        print('Commands: Pivot with (a) and (d). Tilt with (w) and (s). Exit with (q)\n')
        with raw_mode(sys.stdin):
            try:
                while True:
                    ch = sys.stdin.read(1)
                    if not ch or ch == "q":
                        break

                    if ch == "w":
                        if MOTOR_Y_REVERSED:
                            Turret.move_forward(self.sm_y, 5)
                        else:
                            Turret.move_backward(self.sm_y, 5)
                    elif ch == "s":
                        if MOTOR_Y_REVERSED:
                            Turret.move_backward(self.sm_y, 5)
                        else:
                            Turret.move_forward(self.sm_y, 5)
                    elif ch == "a":
                        if MOTOR_X_REVERSED:
                            Turret.move_backward(self.sm_x, 5)
                        else:
                            Turret.move_forward(self.sm_x, 5)
                    elif ch == "d":
                        if MOTOR_X_REVERSED:
                            Turret.move_forward(self.sm_x, 5)
                        else:
                            Turret.move_backward(self.sm_x, 5)
                    elif ch == "\n":
                        Turret.fire()
            except (KeyboardInterrupt, EOFError):
                pass

    @staticmethod
    def fire():
        GPIO.output(RELAY_PIN, GPIO.HIGH)
        time.sleep(1)
        GPIO.output(RELAY_PIN, GPIO.LOW)

    @staticmethod
    def move_forward(motor, steps):
        """
        Moves the stepper motor forward the specified number of steps.
        :param motor:
        :param steps:
        :return:
        """
        motor.step(steps, Adafruit_MotorHAT.FORWARD, Adafruit_MotorHAT.INTERLEAVE)

    @staticmethod
    def move_backward(motor, steps):
        """
        Moves the stepper motor backward the specified number of steps.
        :param motor:
        :param steps:
        :return:
        """
        motor.step(steps, Adafruit_MotorHAT.BACKWARD, Adafruit_MotorHAT.INTERLEAVE)

    def __turn_off_motors(self):
        """
        Recommended for auto-disabling motors on shutdown!
        :return:
        """
        self.mh.getMotor(1).run(Adafruit_MotorHAT.RELEASE)
        self.mh.getMotor(2).run(Adafruit_MotorHAT.RELEASE)
        self.mh.getMotor(3).run(Adafruit_MotorHAT.RELEASE)
        self.mh.getMotor(4).run(Adafruit_MotorHAT.RELEASE)


if __name__ == "__main__":
    t = Turret(friendly_mode=False)

    user_input = input("Choose an input mode: (1) Motion Detection, (2) Interactive\n")
    if user_input == "1":
        t.calibrate()
        if input("Live video? (y, n)\n").lower() == "y":
            t.motion_detection(show_video=True)
        else:
            t.motion_detection()
    elif user_input == "2":
        if input("Live video? (y, n)\n").lower() == "y":
            _thread.start_new_thread(VideoUtils.live_video, ())
        t.interactive()
    else:
        print("Unknown input mode. Please choose a number (1) or (2)")
Given my long history of doing unwise and silly things to computers, a cyberdeck was inevitable. After all, I’ve turned a milk crate into a PC case, a Mac into a fish tank, a flare gun case into a wifi repeater, and so on.
Since I’ve been doing a lot of stuff with old satellite dishes lately, I figured it was time for a more organized, self-contained control and receiver setup. The result is above, made from a surplus police car computer, digital satellite meter, various software-defined radio stuff, and nearly the entire contents of multiple spare parts bins.
Believe it or not, this started out almost modern, with netbooks, SSDs, USB 4.0, Displaylink screen, etc. None of which worked the way I wanted. As usual I fell back on older and more familiar hardware. Everything in the current cyberdeck version is Windows-XP era. I’m actually using Q4OS Linux, but it looks and feels just like Windows XP!
The whole thing probably cost under $50, as most of it was stuff I already had lying around. I did spend the big bucks on a new 60% keyboard and a couple battery packs. The hardware includes:
– Touchscreen computer
– RTL-SDR radio
– Various filter / amp modules
– Various WiFi modules
– Satellite meter / digital video player
– PTZ control for my older dish pointers
– LNB power injector
– King-Dome / VuQube control
– Panel-mount port interfaces
The system is fairly modular and expandable, with most major components simply stuck in with velcro tape. If I want to swap modules for a different experiment, I can just pop them out and replace with something else. Most of the ports and controls are exposed in two custom-made panels. A friend asked if I were getting these laser cut… nope, just plexiglass on the table saw, print the layout on a vinyl sticker, and slap em together!
As usual, my finish and quality control are… questionable at best. I am more of a duct-tape, hot glue, and hammer artist than a fine detail artist.
You can check out the build video and some demos of this unit’s abilities on my YouTube channel! I also have a bunch of other videos of satellite-related projects and stuff, with more to come!