Wow! This is one of the coolest time-lapse videos I’ve ever seen. Sean Goebel, an astronomy graduate student, ascended the 14,000 ft summit of Mauna Kea in Hawaii to capture the observatories at work. The footage was filmed over three nights last April, and yes, the lasers are real.
So what do the lasers do?
They function on the principle of adaptive optics. In brief, they are extremely powerful 15-40 watt lasers (1,000+ times more powerful than your laser pointer) that create artificial "guide stars" high in the atmosphere. Atmospheric turbulence blurs out the fine detail of the stars (it's the reason stars twinkle), and by measuring how light from the guide star is distorted on its way down, the telescope can make rapid adjustments to a deformable mirror to cancel out the blurring. This ultimately creates a sharper image of the sky.
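The core idea, measure the distortion with a known reference and then apply the opposite shape, can be sketched numerically. This is a toy one-dimensional model, not actual observatory software; all the array names and numbers are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64  # samples across the telescope pupil

# Unknown atmospheric turbulence: a random phase error added to incoming light.
atmosphere = rng.normal(scale=1.0, size=n)

# The laser guide star is a known, flat reference beacon; measuring how its
# light arrives at the telescope reveals the distortion.
guide_star = np.zeros(n)
measured = guide_star + atmosphere  # wavefront-sensor reading

# A deformable mirror applies the opposite shape to cancel the distortion.
correction = -measured

# Light from a real star passes through the same turbulence...
star_wavefront = atmosphere
corrected = star_wavefront + correction  # ...and emerges flat again

print(np.max(np.abs(corrected)))  # 0.0 in this idealized model
```

In a real system the sensing and correction loop runs hundreds of times per second, and the measurement is noisy, so the cancellation is only approximate rather than perfect as it is here.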
What kind of gear was used?
Straight from Sean:
“I shot the montage on a Canon 5D Mk. II and an old-as-dinosaurs Rebel XT. I’m trying to run the XT into the ground (the shutter is rated for 50,000 photos and I’ve taken about 70,000), but it refuses to die, so I keep using it. When the shutter dies, I plan to fill the mirror box with dirt, plant a cactus in it, and then buy an actually decent second body. Anyway, the 5D II was usually paired with either a Rokinon 24mm f/1.4 or a Tokina 16-28mm f/2.8, and the Rebel XT was usually used with a Tokina 11-16mm f/2.8. Additionally, a Rokinon 14mm f/2.8, Nikon 14-24mm f/2.8, Bower 35mm f/1.4, and a Sigma 50mm f/1.4 were each used for one scene. I also used a home-built rotary table to create camera motion in some of the scenes. My timelapse dolly lives in a closet in California, so it wasn’t used in this montage.”
“I shot all my images in RAW format (yep, that’s a lot of space). Images were edited in Adobe Camera Raw (part of the Photoshop suite). To add adjustable crops/white balance/etc., I ran the images through a moderately buggy program called LRTimelapse. Images were resized to 1080p in Photoshop and saved as jpgs. A few sequences were run through Virtual Dub with MSU Deflicker (for deflickering) or After Effects (for stabilization). The final video was edited together in Adobe Premiere.”
Many of you are very familiar with the debacle that landed on the big screen in the form of the recent Star Wars prequels. All you had to see was a CGI Jar Jar Binks to know that the gritty, western space odyssey you knew and loved was long gone.
The ad agency Sincerely Truman created the above video to highlight 4 rules that JJ Abrams should follow in his attempt to resurrect the Star Wars franchise.
To sum up the video, the rules are as follows:
The story must take place on the frontier. Star Wars is a sci-fi Western, and the intrigue lies in the outskirts of the universe.
The future is old — no shiny spaceships, metallic droids, etc. It must be gritty.
The Force is mysterious. You don’t need to tell us how it works.
“Star Wars” isn’t cute. This is where I think JJ Abrams will certainly excel. He turned Star Trek into a violent, cutthroat universe, and I think he can do the same for Star Wars.
If you’d like to join the movement, head to this site to add your support.
Joshua Jennings and Garret Stuber of the University of North Carolina at Chapel Hill recently developed an experiment to “turn off” hunger in a genetically-modified mouse. The process utilizes a technique known as optogenetics (discussed before here). This technology essentially means that you can use a laser to control certain cells in the brain and then observe what happens to the animal’s behavior. In this case, the researchers successfully manipulated neurons in the bed nucleus of the stria terminalis (BNST), which are known to regulate hunger through their actions on the lateral hypothalamus.
As you can see in the video above, when the laser activates, the mouse immediately begins to eat, and when the laser deactivates, the mouse stops. It’s really quite amazing!
Of course, it would be a long time before anything like this could work in humans. A key factor in this sort of experiment is that the mouse has genetically-engineered cells which respond to light, but this research does represent a first step in understanding how to manipulate neurons to control complex urges such as hunger.
If this sparked your interest, you can read more about the Stuber Lab and its research here, and if you’d like to read the article for yourself (with subscription), head here.
Bot & Dolly is a self-described “small company with big robots.” Specifically, they’re an engineering and design firm that is attempting to use their “big robots” to revolutionize filmmaking. This recent project, known simply as “Box,” uses 3D projection mapping to create a truly magical demonstration.
Projection mapping is a rather old concept (dating back to the Haunted Mansion at Disneyland in the late 1960s), but it has recently come into prominence with the development of specialized hardware and software. Almost any surface can be used to display the three-dimensional images, so this technology has wide-ranging applications.
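At its simplest, mapping a flat image onto a surface comes down to estimating a perspective warp (a homography) from four corner correspondences. Here is a minimal sketch of that math using the standard direct linear transform; the corner coordinates are invented for illustration, and this is in no way Bot & Dolly's actual pipeline:

```python
import numpy as np

def homography(src, dst):
    """Solve for the 3x3 homography H mapping src points to dst (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The null space of A (smallest singular vector) holds the 9 entries of H.
    _, _, vt = np.linalg.svd(np.array(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

# Projector image corners -> where the physical surface's corners appear.
src = [(0, 0), (1920, 0), (1920, 1080), (0, 1080)]
dst = [(210, 95), (1700, 60), (1750, 1020), (180, 990)]
H = homography(src, dst)

# Any point in the source image can now be warped onto the surface:
p = np.array([960.0, 540.0, 1.0])   # homogeneous coordinates
q = H @ p
print(q[:2] / q[2])                  # its position on the surface
```

Real projection-mapping rigs do this per surface (and per frame, when the surface moves, as in “Box,” where the robots' motion is known precisely), but the underlying warp is the same four-corner perspective transform.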
It will be exciting to see what this design firm produces next. Find more robotics at Bot & Dolly’s site.