Head on over to the vedphoto store, where you can purchase items from past exhibitions to help support future public exhibitions, student internships and scientific research! Any support is much appreciated: it helps me raise funds for new equipment to process all of the Reactive Reefs 3D mapping data and to cover trip expenses.
Holiday Sale - 15% off orders over $50 with code: VEDPHOTOSALE2014 & free shipping on all orders now until January 15, 2015!
]]>
Stanford Student ARTS brochure
]]>
Thanks to a generous grant from Barbara Hibino and Carl Page at the Anthropocene Institute, I was able to share the Reactive Reefs exhibition at the decadal IUCN World Parks Congress in Sydney, Australia last week.
I was invited to share a portable version of the Reactive Reefs exhibit at the Oceans+ Pavilion as well as participate in a plenary with Dr. Sylvia Earle (can you believe it!) and Dan Laffoley titled "The Future is Cool."
With my colleagues Erica Parke and Blanche D'Anastasi, I presented another talk at the Oceans+ Pavilion titled "Stromatolites, Quadcopters and Sea Snakes." We discussed some of the latest results from our joint collaboration to map the stromatolites of Hamelin Pool, Western Australia and, among other things, what a physicist, biologist and geologist all have in common!
It was truly a phenomenal experience to be surrounded by caretakers of planet Earth and learn about their tremendous efforts and contributions to the world's most precious resources.
Interview at the Oceans+ Pavilion after the Stromatolite talk (16:00 onwards):
New video rendering showing the latest Reactive Reefs coral map data captured in American Samoa:
Fluid Lensing Overview
Coral Diorama demo
Reactive Reefs exhibit at the World Parks Congress one-pager:
World Parks Congress Trip Slideshow:
]]>
Imagining the universe 2
Below you can see an overview of the interactive 3D Coral Diorama. Also, you can read more about how we made this diorama in the paper below by my summer interns at NASA Ames, Charlene Cuellar and Megan Prakash. We look forward to sharing the next three dioramas we are putting together soon! If you would like to see this firsthand, come by the NASA booth at AGU in SF Dec. 14-19.
Reactive Reefs 3D Coral Diorama - Prototype for the Reactive Reefs Exhibition
Fluid Lensing overview:
Fluid Lensing Overview
Coral diorama paper by Charlene and Megan:
Mapping mission test - Hamelin Pool, Shark Bay
This was hard to keep a secret, and many already know, but I am thrilled to share news of an exciting collaboration with Dr. Pamela Reid's group from the University of Miami: another high-resolution underwater mapping effort for the Reactive Reefs project, this time mapping the stromatolites of Hamelin Pool, Shark Bay, Western Australia, home to Earth's oldest reefs and organisms. I am in the field from mid-March until the end of April mapping stromatolites around Shark Bay (photo below by Ergin Karaca).
The hypersaline waters here have kept this microbial community thriving and also help to keep the massive sharks at bay, though the venomous sea snakes love it! The sub-cm-resolution 3D maps I will generate are of great interest to research teams around the world and, in particular, to NASA's astrobiology researchers, as stromatolites are Earth's oldest lifeform, evidenced by fossil records dating back 3.5 billion years! All that O2 you love? Thank your stromatolite friends for oxygenating our atmosphere.
Arrival at Hamelin Pool, Shark Bay, Western Australia:
Stromatolites in Hamelin Pool! - Pustular mat stromatolites at low tide at Flagpole, Shark Bay
Arrival at Hamelin Pool
Getting stuck in coquina shells. Can you believe that entire shoreline is made up of these little guys?
Stuck in shells
Sharknado?:
Shark Bay - We found this little guy on the trail to Hamelin Pool. Sharknado?
Octocopter being tested and prepared for mapping missions (below). This guy put me through the full gamut of electrical woes: ESCs mysteriously shorting out, calibrations drifting after a few flights, and gains changing with even the slightest change in payload configuration!
Octocopter - Ready for mapping mission
Fortunately, after lots of soldering and reassembly, I was able to get this thing stabilized and flying:)
Octocopter test flight in Perth - Finally stable!
On the road to Hamelin Pool from Perth
Sample flight video below (resized for slow internet here):
Flagpole short flight
]]>
These are the first data sets processed on the new computer setup (funded by you!) and Agisoft's generous donation of their professional GIS software.
Hover over figures to see descriptions below:
Fluid lensing on aerial Samoa data - RAW aerial image on the right and 45 frames (0.75 seconds of video data) post-processed using Fluid Lensing on the left. Notice how compression artifacts, caustics and wave distortions render nearly all structure in the undersampled original image indistinguishable, while the lensed data show clear features and no caustics.
Fluid lensing removes the surface distortions that typically limit 3D reconstruction. Below, I show a fluid-lensing-enabled 3D reconstruction using structure from motion for a segment of flight data. Note that boundary regions show errors due to insufficient data and overlap there; the full data set will merge all flights to provide sufficient overlap. Data are rendered at 1/10th actual resolution.
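As a rough illustration of why many frames help (a toy model of my own, not the actual Fluid Lensing algorithm): transient caustics hit different pixels in different frames, so a temporal median over a registered stack rejects them while the static reef beneath survives.

```python
import numpy as np

rng = np.random.default_rng(0)

# Static seafloor "scene" (1D for simplicity): brightness values in [0, 1].
scene = np.linspace(0.0, 1.0, 64)

# Simulate 45 registered video frames, each corrupted by bright transient
# caustics at random positions (surface-wave focusing events).
frames = []
for _ in range(45):
    frame = scene.copy()
    hits = rng.integers(0, scene.size, size=5)
    frame[hits] += rng.uniform(1.0, 3.0, size=5)  # transient glints
    frames.append(frame)
frames = np.stack(frames)

# A temporal median rejects the sparse transients: each pixel is corrupted
# in only a few of the 45 frames, so the middle value is the true scene.
recovered = np.median(frames, axis=0)
print(np.abs(recovered - scene).max())  # essentially zero residual
```

The real method does far more (it exploits the lensing events themselves for magnification), but the stack-and-reject intuition is the same.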
3D Coral Reef Reconstruction - Using structure from motion combined with Fluid Lensing (preliminary 1/10th-resolution dataset)
]]>I am in the midst of my PhD qualifying exams, so bear with me until I am done on the 12th of November and can get back to processing all the coral mapping data from our trip to Ofu!
Update (December 13, 2013) - I passed! Officially a PhD candidate. Some new results added below. I am in sore need of new equipment to process all the 3D data - check out the new Vedphoto store and holiday sale if you can. All proceeds go to funding the Reactive Reefs project and providing this data free to the public!
However, as part of my research presentation, I did manage to make the first 3D map of the underwater reef from just a snippet of the full data set. After many grueling hours of computation (over 9 hours on an 8-core machine + GPU), I at last have a preliminary 3D map! The reconstruction actually worked much better than I had imagined, though finishing the entire map will surely require much better computational resources. The maps below contain over 3.5 million polygons; I need to bump that up to ~100 million for maximum resolution.
The basic idea behind the 3D reconstruction is to use otherwise discarded data from the Fluid Lensing process and exploit the motion of the quadcopter over the reef. Fluid Lensing removes all the surface distortions from the image sets, so I can accurately triangulate the distance to each voxel (three-dimensional pixel). Once I have a point cloud, the computationally intensive process of fitting a surface to those points begins. Once the surface is computed and discretized into millions of small polygons, I can overlay the high-resolution texture that Fluid Lensing provides. Below is a schematic of how the imaging was captured. As the quadcopter moves above the surface, it creates a stereographic baseline to view the same surface from multiple points (like your eyes). This allows a distance to be computed and a 3D surface to be rendered.
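The stereo-baseline idea can be sketched with a toy calculation (my own illustration; the real pipeline solves a full structure-from-motion problem over many views, not a single pair). For a camera translating sideways, the apparent pixel shift (disparity) of a feature between two frames gives its depth by similar triangles: Z = f * B / d.

```python
# Toy depth-from-motion example: a quadcopter translating by a known
# baseline B sees the same point shifted by a disparity d in pixels;
# with focal length f expressed in pixels, depth Z = f * B / d.

def depth_from_disparity(f_px: float, baseline_m: float, disparity_px: float) -> float:
    """Triangulate depth (meters) from stereo disparity (similar triangles)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return f_px * baseline_m / disparity_px

# Hypothetical numbers (not from the flight logs): 1000 px focal length,
# 0.5 m of quadcopter motion between frames, 100 px measured feature shift.
z = depth_from_disparity(1000.0, 0.5, 100.0)
print(z)  # 5.0 m to the feature
```

Smaller disparities mean larger, noisier depths, which is why longer baselines (more quadcopter motion) improve the z-resolution.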
3D reconstruction
Here is a close up of the rendered 3D surface showing the 3D geometry of the corals beneath the ocean surface!
coral high shading render
Here is the 3D model covered with the high-resolution Fluid Lensed map:
coral high texture render
And a brief animation exploring the 3D environment. Sorry about the lag - I will need to get the funds together to build a dedicated 3D map-processing desktop that can handle the graphics and reduce the render time. Preliminary Coral Reef 3D map
More to come soon!
]]>
Thank you for supporting my mission and I hope you enjoy your hand-made piece!
]]>
I will have five large-format pieces on display featuring my latest work on brushed aluminum. Come on out, have a drink and support the Arts in Tech through these nonprofits! Info and event tickets available here:
]]>
Several of my worker portraits from Kenya, Uganda and California will be on display as one of the Samasource Give Work Gala 2013 sponsors. Learn more about how to give work and not just aid this November 1st at The Regency Ballroom in San Francisco. Tickets for the event below. Last year's entertainment and model for me: Estelle.
]]>
Front page of Samoa News:
Quadly at work:
Preliminary depth interpolation using the moving quadcopter as a stereographic baseline -
Original:
Interpolated depth (a work in progress, but should be able to resolve 5-10 cm in z):
And the now famous Coral Rap feat. Trent & Tamaki:
]]>
Ofu Island, American Samoa
Below is a quick first look at the coral mapping from 4 seconds of flight data from today (over 15MP!). This image is without HiMARC Fluid Lensing, so expect another 4-10 fold increase in resolution. 4 seconds of flight data out of the 30,000+ seconds we expect to generate!
Finally, I feel like I am making a contribution to science. The highest resolution coral map to date. SCIENCE!
Learn more about our Reactive Reefs project here.
Close up:
Reactive Reefs is an interdisciplinary coral reef imaging and mapping project that features award-winning advanced imaging techniques to provide centimeter-scale optical aerial maps and underwater gigapixel panoramas of at-risk coral habitats. Using a technique we developed called Fluid Lensing, we are able to use perturbations and small waves in the ocean’s surface to image shallow subsurface marine targets with minimal artifacts and distortion from an aerial platform with enhanced angular resolution. In summer 2013, our team is travelling with the Palumbi lab to Ofu Island in American Samoa to generate maps of coral reefs of interest using Fluid Lensing from a small electric quadcopter equipped with our imagers. The effort will provide Stanford’s Palumbi and Pringle labs with valuable scientific data on coral health and distribution as well as one of the highest-resolution optical maps of a subsurface marine target to date.
In addition to its scientific mission, Reactive Reefs is a collaboration with Stanford’s Arts Institute and will culminate in an immersive science outreach exhibition that aims to transport the public beneath the ocean’s surface and convey firsthand how the world’s coral reefs change as a result of both natural and human pressures. Further information about Reactive Reefs, preliminary imaging results and Fluid Lensing is available at: www.vedphoto.com/reactive-reefs and www.vedphoto.com/himarc
Remember, all of this work is supported by people like you through my exhibitions. My latest exhibition, Above, Below & In-Between closes August 28th at the Sacramento Contemporary Art Gallery!
Latest:
August 19, 2013 in Fagatele Bay, American Samoa
August 20, 2013 on Ofu Island:
1000+ year old coral with Olosega in the background:
An underwater panorama preview:
The first quadcopter imaging test over coral!:
]]>
Over the past three years, I have donated more than $35,000 in professional photography and videography services to Samasource, an innovative non-profit social business that connects women and youth living in poverty to dignified work via the Internet.
Founded in 2008, Samasource has provided life-changing employment opportunities for thousands of people. Traveling on assignment to Kenya in 2011 with the Samasource team, I got to see firsthand the impact of their work as I interviewed the young men and women who help support their families and send their siblings to school. This year, I travelled to rural Uganda to document their workers' stories and work throughout Uganda.
Below is a preview of Dennis' story. Apologies for the audio; a better stream will be available later. Viewer discretion advised.
]]>
Reactive Reefs team members Megan Prakash, Leo Kheyn-Kheyfets, and Corinn Small investigated coral bleaching through timelapse macro photography, creating a video of the Aiptasia anemone's bleaching process over a 48-hour period. The Reactive Reefs project, done in cooperation with Stanford's Pringle Lab, will aid coral conservation research. A big thanks to Tamaki Bieri and Cory Krediet for their help!
The main issues impeding our timelapse photography were the battery life of the camera equipment and the restrictions of the macro lens' field of view and depth of field. Here was our eventual device:
It was the fourth iteration in a long series of trial-and-error creations...
Our first setup was intended to inform us about the anemone's movement and the length of time it would take for it to bleach fully. After placing an anemone in the incubator at 34 degrees C for three hours to stress it and trigger the bleaching process, we moved it to a petri dish outside the incubator and under our cameras.
A D7000 on a tripod and three flashes (one commander, two slaves) captured a photo once every two minutes, and a Canon camcorder took video. The setup was in a dark room, though the anemone was lit continuously by a blacklight so it would be visible in the video.
Our first attempt
The first setup ended up informing us about more than just the anemone – we also realized how impractical our setup was. The type of Aiptasia that we used would take a week to bleach fully outside of the incubator, and the size of the petri dish meant that the surprisingly mobile polyp escaped the camera's view within a couple of hours. Oops.
Our second attempt featured a smaller petri dish and a camera positioned directly above the dish to minimize distortion and keep the anemone in a single focal plane. We also simplified the lighting by placing the petri dish in a black tank with a single flash. The photo quality increased, but again the battery lives of the equipment could not outlast the bleaching process. Our results were 152 pristine macro photos... and then 450 pitch-black photos when the flash inside the tank ran out of battery.
The second setup, with only 1 flash lighting the anemone and the camera positioned at the zenith of the dish
Third setup: we went in a different direction to circumvent the flashes' battery lives. I decided to make a system that would fit inside of the 34C incubator, shortening the bleaching process to two or three days. Enter the “Nikon iTripod 2000” and a black bucket:
Two anemones confined in a spot plate, the same single-flash setup, and a cardboard DIY solution that fit perfectly in the lower shelf of the incubator. Success was tantalizingly close... alas, the next morning I discovered that one of the batteries in the slave flash had begun to leak, leading to the familiar “200 perfect photos, 400 pitch-black photos” result. Also, the cardboard ended up sagging under the camera's weight, causing the frame and the focus to drift from the anemones. The picture quality was great, though:
Fourth time's the charm, right? The “Nikon iTripod 3000” featured an innovative foam structure that supported my D300s, battery grip, and SB900 without complaint; it retained the same black bucket body and single-flash setup. We christened it the “Vedphoto Hydrophotography Solution,” or “VHS,” due to its similarity to a large blue VHS tape. Corinn and I slid it into the incubator with a prayer and a fond farewell.
Two people, three cameras, two tripods, foam, and a bucket... on one bike.
The iTripod 3000 featured duct-tape-reinforced edges and a body sculpted from a single piece of blue foam.
We ran in every 8 hours to put in fresh batteries and realign the camera. After nearly a week and a half of refining our timing -- Success! We produced the video at the top of this blog post, a compilation of a photo every minute for nearly 48 hours.
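For anyone planning a similar run, the shot budget is quick to estimate (my own back-of-the-envelope using the one-photo-per-minute interval above; the 24 fps playback rate is an assumption, not the project's actual setting):

```python
# Timelapse budget for a 48-hour bleaching run at one photo per minute.
hours = 48
interval_s = 60        # one frame per minute
fps_playback = 24      # assumed playback rate

frames = hours * 3600 // interval_s
video_seconds = frames / fps_playback
print(frames, video_seconds)  # 2880 frames -> 120.0 s of final video
```

That frame count is also why battery life dominated the design: nearly 3000 flash-lit exposures is a lot to ask of AA cells inside a warm incubator.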
]]>
Above, Below & In-Between
On exhibit August 2013
@ The Sacramento Temporary Contemporary
Opening Reception 8 August at 18:00
"Technology Behind the Images" Talk 10 August at 17:00
Above, Below & In-Between offers a perspective on our galaxy, life and perceptions of temporal and spatial scale. For this photography exhibition, I pioneered a new imaging technology called Fluid Lensing and used mathematical projections and mapping techniques to capture unprecedented high-resolution photographs of celestial events, including the Venus Transit and Supermoon; the tallest organisms on Earth, Sequoia sempervirens of Northern California; and marine organisms, including an array of coral species imaged for the first time through the surface of the water. I explore the three main realms of our existence, the land, the sea and the sky, and reveal some of their unique denizens.
As a photographer and a physicist, I aim to share my fascination with the nature of light through the photographic medium. My work features diverse facets of the visual world, including the discovery of an extrasolar planet in high school with amateur equipment, Vogue fashion shoots in Moscow, tsingy and bats in Madagascar and celestial phenomena. I did my undergraduate degree in Astrophysics at Moscow State University and Stanford’s Physics Department. Currently, I am a graduate student in Aeronautics & Astronautics at Stanford and scientist at NASA Ames Research Center.
Above, Below & In-Between Exhibition Pieces on Display
Hear the story here: http://www.npr.org/2013/08/05/209144805/no-tax-dollars-went-to-make-this-space-viking-photo
]]>My entire science outreach exhibition, Physics In Vogue, including the Space Vikings photoshoot and Discoverer of Worlds pieces, is funded by the 2011 Stanford Angel Grant, SiCA Spark! Grants and readers like you. However, it seems a congressional inquiry was required to confirm this.
Just as a reminder: no NASA or other governmental or organizational funds, resources or employee time were used to make these pieces. All images appearing in the finished pieces are my own, captured with my telescopes, and are not NASA imagery. NASA cannot afford to promote their own missions in this way, which is partly why I started this project in 2011; yet more was probably spent in taxpayer-funded man-hours investigating me, my exhibition and those involved than it would have cost them to do it themselves.
Oh well, at least the science is being spread! I am only done with 2 of the 10 pieces, but I am putting finishing touches on the 3rd. Thank you all for your support and encouragement throughout this process!
]]>
Part of the Reactive Reefs team (Ved Chirayath, Tamaki Bieri, Megan Prakash, and honorary member Andrew Elmore) spent 6/22 in Catalina Island's kelp forests, attempting to test a setup for underwater gigapixel panoramas. The trip was a practice run for a dive at Ofu, in American Samoa, where the team plans to image coral reefs both from an aerial quadcopter platform using HiMARC's Fluid Lensing and underwater with gigapixel imaging techniques. Stay tuned for preliminary results! Below are some features from our trip.
Trip video:
We got an early start on Saturday morning, leaving Los Angeles for Long Beach at 5:30am to catch the first ferry to Catalina Island. Once there, we rented diving equipment, had breakfast, and then quizzed some locals on suitable places to photograph, eventually deciding to try both Lovers Cove and Casino Point.
Megan and Tamaki
As the sun began to emerge from the morning clouds, we struggled into our wetsuits on the rocky beach of Lovers Cove, then assembled the camera equipment and struck out into the water. The plan was to find small clearings within the cove's kelp forest that would make ideal locations for the center of a panorama. Luckily, there were several of these, so we spent the rest of the morning in the water helping Ved capture photos. Since he was turning in full circles to make 360 degree panoramas, the rest of us had to swim along behind him to stay out of his field of view, looking like some kind of amateur synchronized swimming team!
Several sacrifices were made for the sake of science – our tripods were too flimsy to resist the water and eventually fell prey to forces of nature, though we tried to improvise with diving weights to hold them in place. Ved also cut his ankle pretty badly on sharp barnacles while trying to get out of the water, but continued valiantly with his camera through stinging salt water and curious fish.
Casino Point, our second location, was about a mile's walk from Lovers Cove. The area was much more crowded, with several scuba divers and snorkelers exploring the water. We did find some photogenic areas and took more photos, then spent the rest of the day enjoying the island.
The setbacks prevented us from practicing with a final setup for Ofu, but we learned a lot about what we needed to be successful. Next time, we'll definitely pack (1) a much heavier tripod, (2) booties and gloves, and (3) a buoy to help carry the heavy equipment in the water. Hopefully, fewer DIY solutions will be needed.
]]>
This past week we constructed and tested an Arducopter Quadcopter platform for aerial imaging tests of HiMARC's Fluid Lensing on underwater targets. Ultimately, we hope to image coral reefs through the water's surface as part of the Reactive Reefs project.
This technology may help marine biologists study coral bleaching without having to dive down into the water. The people working on this project are Trent Lukazyk, who is engineering and flying the Quadcopter, and me, Leo Kheyn-Kheyfets, helping with the engineering, photography and documentation of the Quadcopter systems, as well as with the HiMARC and Reactive Reefs teams.
After much anticipation, on June 24 we began working on our Arducopter Quadcopter and were able to fully construct it. Our next step was to calibrate both the Quadcopter and our R/C controller with the provided Mission Planner software. With both fully calibrated, we were ready to take it outside for a test flight.
Putting the Quadcopter in "Stabilize Mode," we got it off the ground with ease and flying through the air. While in the air, we used the "Loiter" and "Altitude" modes. After getting comfortable with the controls, the next day we performed a full run with the Quadcopter.
Starting the run at 9:17 a.m., we flew the Quadcopter in several modes until the battery got too low at 9:35 a.m. and we could no longer get it off the ground.
Going through the data log of the first full test, we found that during the 17-minute run the battery voltage started at 12.3 V and ended at 8.6 V, at which point we were no longer able to lift off. After the voltage dropped below 10 V, the battery died very quickly, falling to 8.6 V in a matter of seconds. When the Quadcopter is just loitering, it draws 15 A, but during maneuvers it draws up to around 18 A.
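Those numbers imply a rough endurance budget. A minimal sketch, assuming a hypothetical 5.0 Ah pack with ~80% usable capacity before the voltage sags too low to maintain lift (the actual battery capacity isn't in the log):

```python
# Rough flight-time estimate from average current draw.
# ASSUMED values, not from the flight log: a 5.0 Ah LiPo pack of which
# only ~80% is usable before voltage collapse.
capacity_ah = 5.0
usable_fraction = 0.8
avg_current_a = 16.0  # between the logged 15 A loiter and 18 A maneuvering draws

flight_minutes = capacity_ah * usable_fraction / avg_current_a * 60
print(round(flight_minutes, 1))  # 15.0 min, in line with the logged run
```

The sharp voltage drop below 10 V is typical of LiPo discharge curves: capacity is nearly exhausted once the knee of the curve is reached, so planning around usable capacity rather than the nameplate rating is the safer bet.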
The Stabilize mode was used to take off and works by trying to maintain a level aircraft. Loiter mode holds the Quadcopter's position in the x, y and z directions, returning it to a set point. Altitude mode keeps the Quadcopter at a constant altitude, moving it only in the xy plane, with the throttle then changing the altitude.
Here are some logs produced from the first full run:
Some photos from the construction and testing:
]]>
We are offering a $1000 cash reward or $2000 credit in the vedphoto store for anyone who successfully recovers our payload (with memory cards intact) by September 16. After September 16, we will be returning to search the area ourselves.
What you are looking for: A red parachute, silver radar retroreflector and payload box with cameras inside. Please DO NOT OPEN the box if found, we need to open it. The balloon will have popped, so you may only see shards of it remaining:
Where to look: See map below. Contours indicate landing-site probability (the red contour marks the highest-probability region). Some areas have been removed because we already searched them.
Download the full resolution map here (imagery by Google Earth, compiled for search use only):
http://www.vedphoto.com/himarcballoon1/h676B9D4D
Remember:
Please be safe and prepared before setting out into this region. Temperatures can reach 115 F, and the area is miles from major roadways. In addition, please keep up to date on local forest fires, and do not start one yourself!
]]>HiMARC high-altitude balloon launch to test Fluid Lensing's correspondence principle, from the Grand Canyon, AZ
A successful launch from the Grand Canyon and plenty of adventures! Check out the links below to see our launch videos, timelapse sequences and journey. We launched northeast of the Grand Canyon, just north of Tuba City, AZ. After visually tracking our payload for half an hour, we set out to the predicted landing site and were receiving telemetry up to 97,000 ft!
Then the real adventure began, but suffice it to say we came home empty-handed for now. To that end, we would like to announce a $1000 reward, or a $2000 vedphoto order credit, to anyone who retrieves our intact payload by July 16. We have made some sophisticated maps to help in the search and have narrowed our 95% confidence area down to a region about five square miles in size.
Recovering our HiMARC balloon payload will help us complete our mission to address the atmospheric Fluid Lensing correspondence principle as well as provide data for the GPS group!
Setup timelapse:
After Ashish's optimistic "see you in 2 hours," the real search began! Please excuse any expletives:
T +14 hours!
]]>
Current pipeline:
Helmholtz wave equation with random perturbations:
Simulated refractive effects from a perturbed surface:
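For reference, the governing equation in that first pipeline step is the scalar Helmholtz equation; a perturbed form can be written as follows (my notation, since the original figures are images and the exact perturbation model isn't shown):

```latex
% Scalar Helmholtz equation with a randomly perturbed refractive index
\nabla^2 u(\mathbf{r}) + k_0^2 \, n^2(\mathbf{r}) \, u(\mathbf{r}) = 0,
\qquad n(\mathbf{r}) = n_0 + \delta n(\mathbf{r})
```

Here \(u\) is the field, \(k_0\) the free-space wavenumber, and \(\delta n\) a random perturbation of the refractive index that models the wavy air-water interface producing the simulated refractive effects.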
Resolved coral image with Fluid Lensing:
Craigman of Oceanside Photo & Telescope will also be coming along for the ride =>
Special thanks to Max Praglin of RocketChutes.com for the donated chute that will bring our payload home safe!
Recent slide summary of project:
Balloon construction photos by Bryan Chan:
More to come soon!
]]>
Hours: Friday, May 31st 15:00-17:00
Info: Admission free. Y2E2 Red Atrium, near Coupa Café.
HiMARC + Reactive Reefs Fluid Lensing testing at Stanford's Avery Pool
Testing overview:
Underwater test video:
HiMARC Fluid Lensing Testing - Underwater imaging setup, zenith pointing at 2.28 m depth and 0.94 m surface to target
Recorded at 1280 x 960, 100 fps, broad spectrum, compressed video feed on GoPro Hero 3
<-- RAW cropped frame showing multiple lensing events and under-sampled data
Post-processed preliminary result showing corrected geometry, dithering, transient rejection and artifacts from DWTs ->
THIS SHOULD EXCITE YOU! At least, I was excited...
Below are the RAW frames to show you the levels of distortion I was creating. We also moved the target around (xy plane only) to make our lives even more difficult, but still prevailed in resolving details on my resolution target. Notice the significant lensing events; you can actually make out some of the resolution target features by eye!
More to come soon from air to water testing and high definition results. Consider supporting this work by sprucing up your home with some of my prints...
]]>
Weaner pups
HDR Panorama of Año Nuevo State Park with elephant seals in the foreground
Weaner pup noises:)
Full gallery
]]>
Special thanks to my labmates in the Aerospace Design Laboratory for being such excellent models: Tom Economon, Heather Kline and Amrita Mittal.
What: HiMARC Project Presentation by Ved
Where: Space Environment & Satellite Systems Lab Meeting
Durand Building, Room 026 (Basement level)
Stanford University
When: 16:00 - 17:00
Cookies: Yes
Directions:
A preview:
]]>Misha and I embarked to Williams Hill for a two-day extended object imaging run with the telescopes. No one was silly enough to join us for the 24F nights, but we had a great time together and it was well worth the data!
Our camp setup (a serious upgrade over the traditional backpacking rig) -
The telescopes in the background and a Misha who prefers my sleeping pad to his bed -
I took a nice GoPro time-lapse setup sequence, but the low temperatures killed the battery prematurely, corrupting the entire 64GB SD card:(
Fortunately, the other equipment was much more rugged and coped fine. I shall post a DSLR time-lapse of the imaging night soon. Below is a single calibrated frame from the Horsehead Nebula imaging session taken with my Nikon D7000 on the 8'' Imaging Newtonian. Note the secondary reflection from the coma-corrector optic (later removed).
Calibrating and processing over 400 36 megapixel frames (16 hrs of processing and ~600 GB of data) at home on two 8-core machines at once with PixInsight.
The final Horsehead Nebula shot (not that great, but a start). Flame Nebula to the bottom left and the Horsehead on the right. Processing: HDR multiscale wavelets, à trous wavelets, dynamic background calibration. I shall try to post a more blow-by-blow processing account soon.
The high and low pixel rejection maps from the winsorized sigma-clipping image integration (after image calibration and registration). Can you spot the meteor trail?! There are also two satellite trails on the left and a serious internal reflection on the right.
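A minimal NumPy sketch of the sigma-clipping idea behind those rejection maps (simple iterative clipping for illustration, not PixInsight's exact winsorized variant): pixels that deviate from the per-pixel stack statistics by more than kappa sigma, like satellite and meteor trails, are masked out before the final average.

```python
import numpy as np

def sigma_clip_stack(frames: np.ndarray, kappa: float = 3.0, iters: int = 3) -> np.ndarray:
    """Average a stack of registered frames, rejecting outlier pixels
    (satellite/meteor trails, cosmic-ray hits) beyond kappa sigma."""
    data = frames.astype(float)
    mask = np.ones_like(data, dtype=bool)  # True = pixel kept
    for _ in range(iters):
        kept = np.where(mask, data, np.nan)
        mu = np.nanmean(kept, axis=0)      # per-pixel mean of surviving frames
        sigma = np.nanstd(kept, axis=0)    # per-pixel std of surviving frames
        mask = np.abs(data - mu) <= kappa * sigma + 1e-12
    return np.nanmean(np.where(mask, data, np.nan), axis=0)

# Twenty registered "frames" of a flat 100-count field, one of which is
# crossed by a bright trail in a single pixel.
frames = np.full((20, 4), 100.0)
frames[3, 2] = 5000.0  # satellite trail in frame 3
stacked = sigma_clip_stack(frames)
print(stacked)  # trail rejected; all four pixels recover 100
```

The "high" and "low" rejection maps are just the per-pixel counts of frames masked above and below the mean, which is why the meteor and satellite trails show up so cleanly in them.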
And some other post processed images from the two nights (on a 200mm Nikon lens, 105 mm Nikon lens with the D800 and the D7000 at prime focus of the 8'' Imaging Newtonian):
Orion Nebula:
Wide-field shot of Orion's belt and sword, including Horsehead and Orion Nebulas:
The morning after:
And some awesome physics at sunrise:)
I shall post some more images soon and update the Astrophotography collection with high-resolution versions of these files after some more post-processing. Stay tuned!
]]>Item | Purpose | Owner/Supplier |
Optics | to see | physics |
Coronado H-alpha 80mm solar telescope
|
Solar imaging, testing atmospheric lensing & lucky-imaging techniques on high SNR data (the Sun). Hydrogen-alpha single stack tunable etalon interferometer filter operating at 656.26 nm with a bandpass of only 0.7nm! Used for every solar image taken thus far as part of HiMARC ground tests. | Purchased from OPT for $3500, yikes! / optcorp.com |
Meade 8'' LX-200 SCT
|
200mm aperture fork-mounted Schmidt-Cassegrain telescope. 2000 mm focal length (f/10). Used for most lunar and planetary images taken as part of HiMARC ground tests, sometimes with 2-5x barlow lenses. Good go-to performance and thermal stability in tube, but not suitable for extended exposure astrophotography without a wedge (ours cracked). | Courtesy Pumpkin Inc. / Meade Instruments |
Astro-Tech 8'' f/4 Imaging Newtonian
|
Wide-field astrophotography and some planetary and lunar imaging with 2x teleconverters. Fast f/4 scope is very sensitive to collimation errors, so a good Cheshire eyepiece is a must. Used to take most of my wide-field astrophotographs (800mm focal length) with a DSLR or other camera mounted and Orion Autoguider on the CG-5 or CGEM mount. Must be used with a coma-corrector and then some.. | Purchased from Astronomics for ~$600 / Astro-Tech |
Astro-Tech coma corrector
|
2'' coma corrector for the Imaging Newtonian | purchased for ~$220 / Astro-Tech |
Hutech IDAS Light Pollution Filter
|
It has a very convenient bandpass for Palo Alto-like locations:![]() |
Purchased for $220 from Hutech / Hutech |
Celestron 14'' SCT on CGEPRO![]() |
A beast of an SCT - this is really only suitable for planetary and lunar imaging. Try autoguiding at 4000mm focal length and f/10! On loan from the generous folks at Oceanside Photo & Telescope. Used for HiMARC ground tests on lunar and planetary targets where noted. Contending with dew on the corrector plate is no small task - even after pumping 80 watts to the plate with nichrome heating elements! Setting up alone in the mountains is comical... | On loan since June 2012 from OPT! |
Nikon AF-S 105mm f/2.8 | Used for wide-field astrophotography mounted piggyback on the bigger scopes. One of the few Nikon lenses with decent coma performance wide open at f/2.8. | Purchased for $950 / Nikon |
Orion Autoguider Package | Used in conjunction with PHD autoguiding software on the CGEM mount for extended-exposure tracking. 80mm refractor coupled to a ~1 MP CCD, suitable only for short focal length telescopes. | Purchased for ~$450 / OPT |
Mounts
Celestron CG-5 | Does the job for solar imaging sessions, without the weight. Not very accurate for extended-exposure astrophotography, even autoguided. | Purchased for ~$700 / High Point Scientific |
Celestron CGEM | The main mount I use for extended-exposure astrophotography with an autoguider. Capable of handling the 30+ lbs of equipment I stack on top and much easier to operate than the CG-5. The polar alignment feature works fairly well if you know what you are doing (read: hours of learning the painstaking way in the cold). | Purchased for ~$1500 / OPT |
Cameras
Nikon D800 | 36 MP DSLR I use for general photography and wide-field and prime-focus astrophotography | Purchased for $3000 / Nikon |
Nikon D7000 | 16 MP DSLR I use for general photography and prime-focus imaging on the Newtonian. APS-C sensor size helps reduce areas of coma on the Newt. | Purchased for ~$1300 / Nikon |
FLI Proline TEC CCD | Still trying to get this to work without icing over the sensor. Stay tuned... | Borrowed from Pumpkin, Inc., price astronomical / FLI |
Point Grey Flea3 USB 3.0 3.2 MP monochrome | High-speed imager for testing atmospheric lensing theory & lucky imaging. 3.2 MP monochrome sensor used for Venus transit imaging, ~3 micron pixel size. Software is extremely buggy, with crashes common on multiple systems. USB 3.0 is very new and the bandwidth requires an SSD or Thunderbolt-speed interface. | Courtesy NASA / Point Grey |
Point Grey Flea3 USB 3.0 1.2 MP monochrome | High-speed imager for testing atmospheric lensing theory & lucky imaging. Go for the 3.2 MP sensor unless you like tiny fields of view; very suitable for planetary targets though. Software is extremely buggy, with crashes common on multiple systems. USB 3.0 is very new and the bandwidth requires an SSD or Thunderbolt-speed interface. | Courtesy NASA / Point Grey |
Point Grey Flea3 USB 3.0 8.2 MP color | High-speed imager for testing atmospheric lensing theory & lucky imaging. The 8 MP color sensor has a bad Bayer filter configuration and noise profile. Software is extremely buggy, with crashes common on multiple systems. USB 3.0 is very new and the bandwidth requires an SSD or Thunderbolt-speed interface. | Courtesy NASA / Point Grey |
GoPro Hero 3 | Just for setup and teardown. Also, in case they find a body frozen to the telescope and need forensic evidence. | Purchased for $500 / GoPro |
Software
PixInsight | One of the best pieces of image processing software, hands down! Extremely well written, with good documentation and powerful, modern image processing tools. I could spend a lifetime exploring all the tools this program has to offer. Runs on multiple platforms and 64-bit architecture, and handles all the huge files I throw at it. Costly, but worth the price to every astronomer in my opinion. | Purchased for 170 Euros / PixInsight |
Deep Sky Stacker | A good beginner program to get started in extended-object astrophotography - quickly calibrating light frames with bias, flat and dark calibration frames. Unfortunately, there is not much beyond frame calibration and integration, and the tools are limited. Also, the 32-bit program is limited to processing small files :( Donate if you can. | Free! / DeepSkyStacker |
PHD Autoguiding | Push-here-dummy freeware that does the job of autoguiding and calibrating automatically, with more advanced features built in. Sometimes buggy and slow on netbooks (could be an issue with the drivers for the Orion Autoguider). Donate if you can. | Free / PHD Autoguiding |
AutoStakkert!2 by Emil Kraaikamp | An excellent piece of lucky-imaging software built and designed by Emil. I think this program has wide applicability to image and video processing and is a phenomenal piece of code. Donate if you can. It is the primary piece of software I use to test atmospheric lensing on planetary and solar targets. I am hoping to collaborate with Emil to design a similar piece of software with some of my algorithms from the ground up in C++. Stay tuned. | Free! / AutoStakkert! website |
MATLAB | Don't leave home without it. Even though I am terrible at actually using it, I know others who can! | ~$500, more depending on packages like Image Processing / MathWorks |
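The frame calibration Deep Sky Stacker performs (light frames corrected with bias, dark and flat frames) boils down to a few lines of array math. Here is a minimal sketch with tiny synthetic frames; the array values are made up purely for illustration:

```python
import numpy as np

def calibrate_light(light, master_dark, master_flat, master_bias):
    """Standard CCD frame calibration: subtract dark current from the
    light frame, then divide by the bias-subtracted, normalized flat
    to undo vignetting and dust shadows."""
    flat = master_flat - master_bias  # remove the bias signal from the flat
    flat /= flat.mean()               # normalize so division preserves flux
    return (light - master_dark) / flat

# Synthetic 2x2 frames; the flat encodes a diagonal sensitivity pattern
# that also shows up in the light frame and is divided out.
light = np.array([[200.0, 110.0], [110.0, 200.0]])
dark  = np.full((2, 2), 10.0)
bias  = np.full((2, 2), 5.0)
flat  = np.array([[105.0, 55.0], [55.0, 105.0]])

calibrated = calibrate_light(light, dark, flat, bias)
```

Pixels where the flat is dim get boosted and pixels where it is bright get suppressed, which is exactly the vignetting correction DSS applies before stacking.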
Obligatory soundtrack: Zap Mama's "Supermoon"
On May 6th, 2012, a supermoon made the night sky a bit brighter than most. A supermoon, or as Wikipedia eloquently puts it, a “perigee syzygy of the Earth-Moon-Sun system,” occurs when a full moon coincides with perigee, the moon's closest approach to Earth. The moon's distance usually varies between 222,000 miles (357,000 km) and 252,000 miles (406,000 km); at this perigee it was only 221,802 miles away. At that distance the moon appears about 14% larger and 30% brighter than a full moon near apogee.
A comparison of last year's March supermoon (right) with an average moon from December 2010. Photo by Wikimedia Commons user Marcoaliaslama
But the supermoon of May 6th was even brighter than most: this year the full moon and perigee were almost perfectly aligned in time, off by only a single minute (often the two differ by a full hour). That near-perfect coordination made this supermoon among the brighter and larger moons of the entire 18-year Saros cycle.
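The 14% and 30% figures follow directly from the two distances: apparent diameter scales as 1/distance and received brightness as 1/distance². A quick check:

```python
# Apparent diameter scales as 1/distance; flux received scales as 1/distance^2.
apogee_km = 406_000
perigee_km = 357_000

size_ratio = apogee_km / perigee_km   # ~1.14 -> about 14% larger at perigee
brightness_ratio = size_ratio ** 2    # ~1.29 -> roughly 30% brighter

print(f"Perigee moon is {size_ratio - 1:.1%} larger "
      f"and {brightness_ratio - 1:.1%} brighter than at apogee.")
```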
Unlike most astrophotography subjects that demand remote shooting locations with minimal light pollution, Ashish and I were able to photograph the moon from a park in the middle of Palo Alto. Using HiMARC's atmospheric lensing processing pipeline, an 8'' Schmidt-Cassegrain telescope, and a high frame-rate camera we collected over 3 Terabytes of data that were later processed to create a series of large composite images for a full panorama. See the HiMARC page for resolution details and more on atmospheric lensing.
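Distilling terabytes of high frame-rate video into a sharp composite leans on the lucky-imaging idea: score thousands of short exposures and stack only the sharpest, which are the frames least smeared by the atmosphere. This is a toy illustration of that selection step, not the actual HiMARC pipeline (which aligns frames and works on small patches as well):

```python
import numpy as np

def sharpness(frame):
    # Gradient energy: frames less blurred by the atmosphere score higher.
    gy, gx = np.gradient(frame.astype(float))
    return float((gx ** 2 + gy ** 2).mean())

def lucky_stack(frames, keep_fraction=0.1):
    """Rank short exposures by sharpness, keep the best few percent,
    and average them to beat the seeing."""
    scores = [sharpness(f) for f in frames]
    n_keep = max(1, int(len(frames) * keep_fraction))
    best = np.argsort(scores)[-n_keep:]
    return np.mean([frames[i] for i in best], axis=0)
```

Averaging only the top fraction trades total exposure for resolution, which is why so much raw data is needed in the first place.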
Thank you to the Vikings of Bjornstad and all the volunteers who helped make this shoot happen!
Photographer: Ved Chirayath, Producer: Michael Bush, Makeup Artist: Inna Mathews, Vikings of Bjornstad: Jack Garrett, Kay Tracy, Victoria Parker, Patricia Petersen, Brian Agron, Henrik Olsgaard and Ed Berland, Volunteers: Andrew Elmore, André Sales, Katie Zacarian, Heather Kline, Zhijian Qiao, Katheryn Bradford, James Schalkwyk, Sahadev Chirayath
The second installation of Physics In Vogue, Space Vikings, is complete! Check out the results and see videos of the process in the Space Vikings Gallery.
Space Vikings - Featuring NASA Ames Director Dr. Simon P. Worden, the Vikings of Bjornstad and CubeSat satellites
NASA Ames Research Center leads the charge in small satellite innovation and development while evoking the Viking spirit of exploration and adventure. NASA Ames Center Director Dr. Simon P. Worden poses alongside the Vikings of Bjornstad for this photograph to personify that spirit and highlight NASA leadership in the modern space age. The next generation of small satellites, known as CubeSats, float above and herald a new era in space exploration and science. These CubeSats can be launched into Low Earth Orbit at a fraction of the cost of traditional systems, allowing more frequent and more accessible scientific missions. My PhD research project, HiMARC, is based on this platform.
This project is supported by the 2011 Stanford University Angel Grant and SiCA Spark! Grant.
No other funding was received for this science outreach exhibition.
Photographer: Ved Chirayath; Producer: Michael Bush; Makeup Artist: Inna Mathews; NASA Ames Center Staff: Dr. Simon P. Worden, Center Director, Karen Bradford, Chief of Staff, and Carolina Rudisel, Executive Secretary; Vikings of Bjornstad: Jack Garrett, Kay Tracy, Victoria Parker, Patricia Petersen, Brian Agron, Henrik Olsgaard and Ed Berland; CubeSats: Pumpkin Inc.; Volunteers: Andrew Elmore, André Sales, Katie Zacarian, Heather Kline, Zhijian Qiao, Katheryn Bradford, James Schalkwyk, Sahadev Chirayath.
Behind the scenes:
Coming soon - The Vikings of Bjornstad photos!
Update 20 Dec 2012: A snow storm on our trip up to Oregon has caused a one-day delay on these - check back Friday afternoon.
Space Vikings Photoshoot -
NASA Ames is at the forefront of innovation in the modern space race, and we're aiming to encapsulate that spirit in an upcoming photo shoot featuring its director, Dr. Pete Worden, chief of staff, Ms. Bradford, and executive secretary to the director, Ms. Rudisel, as Viking explorers. They'll be shown in full costume, 'catapulting' CubeSats (the next generation of tiny satellites) into orbit with a trebuchet. We expect this shoot to be a very memorable experience, so we'd like to invite you to come out and see for yourself. It is the second piece in Physics In Vogue, a photography exhibition that aims to shed light on profound contemporary physics discoveries by combining fashion photography, laboratory-grade optical effects and scientific accuracy to create visual representations of complex science.
This project is supported by the 2011 Stanford University Angel Grant and SiCA Spark! Grant.
No other funding was received for this science outreach exhibition.
The event will be held at Palo Alto's Foothills Park (address: 3300 Page Mill Rd., Los Altos, CA - map below) and will be taking place this Friday from 13:30 to 15:30. We'll be meeting near the park's interpretive center.
We are pleased to have the help of several wonderful volunteers, including the living history group, Vikings of Bjornstad, the fantastic makeup artist, Inna Mathews, and CubeSat Satellites on loan from Prof. Andrew Kalman at Pumpkin Inc. We look forward to a memorable shoot! Join us for snacks and beverages beforehand and come with your Viking spirit of exploration!
Founded in 2008, Samasource has provided life-changing employment opportunities for thousands of people. Traveling on assignment to Kenya in 2011 with the Samasource team, I got to see firsthand the impact of their work as I interviewed the young men and women who help support their families and send their siblings to school.
I am happy to share selected images and video interviews of the workers here or in the embedded gallery below. You can see how Samasource's micro-work model is impacting individuals throughout Kenya and elsewhere. Be sure to check out their website at www.samasource.org.
Also, coming up - I am preparing 8 large-format canvas wraps for the Samasource 2012 GiveWork Gala!
September 12, 2012:
I am happy to announce that after imaging the Venus transit and Cygnus constellation with HiMARC technology and Dr. Natalie Batalha and the Kepler Spacecraft model in the studio, the first exhibition piece for Physics In Vogue, Discoverer of Worlds, is ready! The image below features NASA's Kepler Space Telescope Mission and its method for discovering extra-solar planets by photometric transit detection.
Discoverer of Worlds - Featuring original HiMARC imagery, Dr. Natalie Batalha and NASA's Kepler Mission
I was inspired to feature the Kepler Mission as Physics In Vogue's first piece because I know firsthand just how difficult this kind of science is, and thanks to a once-in-a-lifetime opportunity to image the Venus transit. It takes painstaking dedication, brilliant minds and years of precision analysis to accomplish what the Kepler team has in so short a time. You can learn about the technology I developed behind the imagery on the HiMARC page, and check out the Get Involved page to learn how you can be a part of the extra-solar planet discovery process.

Here is a color-coded breakdown of the components in Discoverer of Worlds (see map on right): The Kepler Space Telescope is NASA's first mission capable of finding Earth-size planets around other stars. The real spacecraft would not fit in my studio and is currently in a heliocentric orbit, so I brought in the hand-built model and overlaid it with solar cell scraps for a look like the genuine spacecraft.

Discovering a planet with the transit method is a particularly low-yield science, as only about 5% of star systems are in a favorable enough alignment to show transits. To contend with this, the Kepler Space Telescope stares at an area of the sky in the Cygnus constellation, chosen to avoid the Sun, point out of the ecliptic plane and include a large number of stars in the field of view. I captured this background of the Cygnus constellation by combining more than 8 hours of exposure time at Humboldt State Park and used my HiMARC algorithms for processing. A bear nearly ate my dog in the process, but it turns out he was more interested in the berries. Red clouds of ionized hydrogen gas form the emission nebulae, including the North America Nebula (flipped for composition), while the Milky Way's dust lanes and stars dominate the frame.

On June 5th, 2012, one of the rarest of predictable astronomical events occurred as the planet Venus' normal orbit brought it to pass between the Earth and the Sun.
Kepler relies on such transits occurring light-years away and, though unable to view the exoplanet directly, infers its presence from a tiny dip in the amount of light coming from its host star. Repeat transits are needed to ensure it was not a giant space bug flying in front of the lens (among other things). This year astronomers were lucky to see the transit only eight years after the previous occurrence in 2004, but the world will have to wait until 2117 to see Venus pass before the Sun again. For me and the rest of the HiMARC team, the transit provided a great opportunity to test our Atmospheric Lensing technology using a narrowband (0.7 nm) etalon interferometer tuned to the hydrogen-alpha emission line on a 90mm refractor. (This means we were able to filter out most of the light spectrum and photograph only a narrow portion in which hydrogen emits light, revealing the rich textures of hydrogen on the Sun's surface that would otherwise be obscured.) Solar features including filaments, flares, sunspots and prominences are visible. More than 1 Terabyte of data were recorded to produce this unprecedented high-resolution result!

Dr. Natalie Batalha, Co-Investigator on the Kepler Mission and Professor of Physics & Astronomy at San Jose State University, unassumingly brings the Vogue element to the piece, posing in the studio as I explain she should be reaching for the next new world with her right hand. As a member of the Kepler team, Dr. Batalha is responsible for the selection of the more than 150,000 stars the spacecraft monitors and works closely with team members at Ames to identify viable planet candidates from Kepler photometry.
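The dip Kepler measures is simply the ratio of the two projected disk areas, which shows why the photometry has to be so precise:

```python
def transit_depth(r_planet_km, r_star_km):
    """Fractional dip in stellar flux during a transit: the planet
    blocks a fraction of the stellar disk equal to (Rp / Rs)**2."""
    return (r_planet_km / r_star_km) ** 2

# An Earth-size planet crossing a Sun-like star blocks less than 0.01%
# of the light; a Jupiter-size planet is over 100x easier to spot.
earth_depth = transit_depth(6_371, 696_000)
jupiter_depth = transit_depth(69_911, 696_000)
```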
Final assembly, color, contrast and brightness adjustments made in Photoshop, nothing else. 48'' x 120'' framed canvas print (available in multiple formats).
Here are some shots from the production:
With a gap between conferences, Plasma UAV data collection and Physics In Vogue, and a new moon, it was finally time to test out HiMARC Atmospheric Lensing on extended objects using a thermoelectrically cooled camera loaned by the good folks at Pumpkin! Mt. Lassen was on fire, so this time I had to head out to Humboldt State Park in Northern California to get to dark skies. To combat the dew problems that had ruined imaging sessions in Palo Alto, I got the latest dew heaters from the always helpful OPT crew and tested all the imaging equipment beforehand to make sure there were no unpleasant surprises.
I got there towards sunset and set up the equipment for a night of imaging. The surprises started cropping up a few hours later.
Among the many local visitors to our camp, a black bear decided to scope things out and nearly ate Misha.
Crisis averted, and finally setting up the scopes. The thermoelectrically cooled camera somehow got moisture in it and formed ice crystals over the CCD. I tried every trick in the book to remedy this, with no luck. This was the beginning of many problems, unfortunately. The dew heaters, despite being properly installed on the 14'' and running at full power (measured), simply could not keep it dew-free. This was a showstopper that persisted for four nights, despite my other efforts to minimize the dew situation. My Nikon lenses and 8'' Imaging Newtonian did work with the dew heaters, so I was able to collect data with them, but at a huge loss to the project. I shall send out more updates on how I will remedy some of these issues before collecting more data.
Back at home - there is some useful data after all. Thanks to the new computer from NASA Ames - I started the processing on multiple images at once.
And some initial results:
More to come soon!
Misha accepts the plaque on my behalf:
We’re very excited to now have the ability to produce our own large-scale prints using the Epson® Stylus Pro 9900 printer. The new equipment is not only capable of printing images 44 inches in height (with, in theory, no limit on length) but also gives us complete control over the quality of our final products.
And we have very high standards for quality, which is why we purchased this top-notch piece of machinery that is well-known to be the very best in its class. (I mean, this thing uses three different types of black ink!) It also prints at an unbelievable speed, probably more than twice as fast as previous generations of inkjet printers.
We’ll admit that installing the Stylus Pro was no small effort. We had to call on the help of four of our buffest friends, and only two thrown-out backs later we had it in place and plugged in. We’ve taken it on a couple of test runs now and we have not been disappointed. All the details and colors of our sharpest gigapixel images are recreated incredibly well, just as they appear on the computer screen.
We’ve got an inkling our clients will love the quality of this new printer as much as we do.
Ved in front of a print from the Epson® Stylus Pro 9900
Before testing the Arduplane autopilot on the Predator plasma UAV, we are sequentially testing each of its functions on a traditionally actuated Bixler glider. Today, we try out the loiter (maintain altitude given a radius of curvature to hover about) and waypoint navigation (go to predefined GPS coordinates) functions.
Here is the latest from Lake Lagunita:
Here is the latest from Lake Lagunita (video storage ran out towards the end):
The funds will be used to help pay for all the telescope equipment, imagers and software we need to help make HiMARC fly someday. We could not have gotten this far without the amazing support of our sponsors.
Posing in front of massive, unnecessary checks was not my idea, but made for amusing photos. I tried to recycle mine in vain.
Hardware includes the following:
Check out the software too:
http://code.google.com/p/ardupilot-mega/
Below are some data and video from our telemetry tests. Range and Tx/Rx rates are excellent (at least 1,000 meters and 100 kbps, respectively). The pitot tubes gave us trouble, so we do not have any data from them yet.
Non plasma-actuated glider with Arduplane board (antenna visible on top of fuselage) - first flight. That landing was actually quite soft. I just like cringing.
Telemetry and flight paths of subsequent flights:
Flight path over Lake Lagunita (Google Earth imagery)
Altitude (meters) vs. time index
Roll, pitch and yaw vs. time index
Vx, Vy and Vz versus time index
A longer flight:
General updates:
HiMARC ground testing and development are proceeding according to schedule. Updates detailing our progress will be posted on this blog weekly. Each imaging session and update will be available through an RSS feed, while previous research notes, data and files will be posted here first.
Our primary goals remain unchanged from the SOW. This summer, we are collaborating with NASA Ames MDD for immediate development, design and testing of our ground telescope system. Our deliverables include high-resolution images of the International Space Station and celestial targets beyond our existing results using Atmospheric Lensing (AL) and the equipment provided to us by the MDD.

The optics team will use Zemax software and MATLAB to refine the asymmetric optical train and simulate DAPPER performance for subdiffraction super-resolution. Completed CAD models will be delivered and prepared for mirror fabrication of future prototypes. The AL eddy selection mask will be simulated in Zemax and refined from its current lenticular lens design for a future prototype fabrication. If budget and time permit, we will fabricate and implement the AL eddy selection mask.

The image processing team will deliver part of the software to command the HiMARC imaging system for capturing and processing images automatically. Our goal is to increase the autonomy of the AL image processing algorithms and decrease processing time by an order of magnitude.
Project schedule (recent):
17 JULY 2012, Stanford, CA – Invited talk on HiMARC at the Kavli Institute for Particle Astrophysics and Cosmology.
26 JULY 2012, Stanford, CA – First imaging session using the 14’’ SCT and new PtGrey cameras with full USB 3.0 bandwidth. Targets included Saturn, Mars and the Moon. Discovered we have a severe data storage problem with only an 80 GB SSD onboard a temporarily purchased MacBook Air (attempts to offload the data resulted in a bus bottleneck). Purchased and built a RAID 5 array with a USB 3.0 interface for the next session to solve this.
01 AUG 2012, Stanford, CA – First imaging session with the new autoguider and RAID 5 USB 3.0 external array.
11-17 AUG 2012, Logan, Utah - HiMARC selected for student award presentation at SmallSat conference in Utah paper ref#- SSC12-VIII-1. Brian Mahlstedt and Ved Chirayath will be presenting new results from ground testing and the HiMARC CubeSat concept to a panel of judges for consideration in the Frank J. Redd student scholarship competition.
18-21 AUG 2012, Lassen National Park, CA – Four-night imaging marathon to test atmospheric lensing capability on extended objects with the 14’’ SCT and the FLI and PtGrey cameras from Pumpkin and NASA Ames MDD.
10-13 OCT 2012, Nagoya, Japan – HiMARC selected for presentation and publication for the UN/Japan NanoSat Symposium in Nagoya, Japan. Science paper due 15 August.
Acknowledgements:
We are very grateful to the NASA Ames Director, Dr. Pete Worden, Matt Daniels, Dr. David Korsmeyer, Chad Frost, Creon Levit, Andrea Nazzal and all the folks at the MDD for your tremendous support of our research and helping develop new tools for the astronomical and small satellite communities. The HiMARC team is also grateful for the continued support from Oceanside Photo and Telescope (OPT) and Pumpkin Inc. for equipment loans and expertise. We look forward to continued collaboration!
Equipment:
Orion awesome autoguider tested and functional on the 14’’ SCT on the CGE Pro mount. Ved purchased PixInsight for image processing and batch scripting for atmospheric lensing development – it has a powerful wavelet toolbox and image registration and integration tools that have already helped reduce the processing time. PtGrey 3.2 MP mono and 8.2 MP color are both functioning as expected, though the read noise on the 8.2 MP seems higher than the PtGrey spec. Discovered we have to switch to Mode 7 or 8 to record regions of interest at higher than native frame rates (>200 fps). Still waiting on the MacBook Pro acquisition process to test atmospheric lensing to its full capability (thank you Matt and Andrea!). Zemax software license needed for optical simulations. Powerpack arrived for remote imaging sessions.
Imaging:
26 July and 01 August session notes – The autoguider performed well, but there is no mounting kit to connect it to the 14’’ SCT; Ashish and Lorenzo are making one for the next session. We tested the guider through the main scope and got stable tracking working using the PHD autoguider software. Calibration data and periodic error measurements will be taken for future imaging sessions and looked fine this time around. The RAID 5 USB 3.0 array has a driver issue and cannot function at USB 3.0 bandwidths on the MacBook Air in Windows (the Mac side is fine, but not compatible with the image capture software). We are waiting on the MacBook Pro for larger onboard SSD storage.
Great control data acquired for Copernicus crater – still processing files now and should have new results shortly.
Theory & Paper:
The UN/Japan NanoSat Symposium scientific paper is due 15 August (it will be forwarded along after submission and shared on the blog). There really should be three separate papers published: on the HiMARC design innovation, atmospheric lensing, and DAPPER (asymmetrically enhanced maximum entropy deconvolution). I am currently working with a group of physicists in Varian on the latter and a mathematical proof; the Japan NanoSat paper includes the other two.
A toy demonstration of an unoptimized asymmetric pupil and corresponding diffraction pattern (courtesy Ashish):
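A toy demonstration like this can be reproduced numerically: under Fraunhofer diffraction, the far-field point-spread function is the squared magnitude of the Fourier transform of the pupil. A sketch with an assumed circular pupil and an arbitrary asymmetric obstruction (not the actual HiMARC pupil design):

```python
import numpy as np

n = 256
y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]

pupil = (x ** 2 + y ** 2 <= 40 ** 2).astype(float)  # circular aperture
pupil[(x > 0) & (np.abs(y) < 3)] = 0.0              # arbitrary one-sided cut

# Fraunhofer diffraction: PSF = |FFT(pupil)|^2, with the peak at the center
# after fftshift; the asymmetric cut makes the diffraction pattern asymmetric.
psf = np.abs(np.fft.fftshift(np.fft.fft2(pupil))) ** 2
psf /= psf.max()
```

Breaking the pupil's symmetry redistributes the diffraction sidelobes, which is the effect an asymmetric optical train exploits.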
Lorenzo's write-up:
HARDWARE
Autopilot and accessories
After receiving the parts for the autopilot, I went on to assemble them and mount them together on a ready-to-fly frame. Problems were experienced with the XBee wireless telemetry kit; in particular, the flight-side unit needed to be revived.
SOFTWARE
Mission planner
The purchased autopilot comes with extremely flexible software that allows you to easily plan any type of mission, as well as configure the parameters to fly a plane in auto mode.
Waypoint
Planning a mission can be done rather easily: the mission planning software has a utility that lets you select points (with their relative altitudes) on a map through a point-and-click interface and have the airplane fly through them. A second option is to write a script with a detailed specification of the desired mission; by doing so it is possible to plan a wide array of options besides the waypoint locations. A select few are: throttle power, speed, pitch-roll-yaw angles, rate of climb and maximum distance.
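For intuition, the core of waypoint flight is computing the bearing and distance to the next GPS coordinate and advancing to the following waypoint once inside an acceptance radius. This is a simplified sketch of the idea only, not ArduPilot's actual code, and the 30 m radius is an assumed value:

```python
import math

def bearing_and_distance(lat1, lon1, lat2, lon2):
    """Initial bearing (degrees) and great-circle distance (meters)
    between two GPS fixes: the quantities a waypoint controller steers on."""
    R = 6_371_000  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    bearing = math.degrees(math.atan2(y, x)) % 360
    a = (math.sin((p2 - p1) / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2)
    distance = 2 * R * math.asin(math.sqrt(a))
    return bearing, distance

def next_waypoint(pos, waypoints, idx, radius_m=30):
    """Advance to the following waypoint once inside the acceptance radius."""
    _, d = bearing_and_distance(*pos, *waypoints[idx])
    return idx + 1 if d < radius_m and idx + 1 < len(waypoints) else idx
```

The autopilot then turns the bearing error into roll commands; loiter mode is the same machinery with a fixed center and a target standoff radius.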
Configuration
The same philosophy as the mission planner is used for configuring the autopilot to command the plane. As before, it can be done through a user-friendly interface or by manually changing the C file; in the mission planner software, under Configuration, one can set the control parameters.
TESTING
Ground Testing
After completing the initial setup and familiarizing myself with the mission planner, I carried out a test of the wireless telemetry kit.
Testing was carried out around the Stanford campus; the screenshot above (right side) shows part of the testing itinerary along with the collected telemetry.
Simulation
Due to the cost of the autopilot, along with its fragility, before field testing it I carried out simulations to assess the full capability of the board. This was done through a software-in-the-loop simulation by means of the X-Plane 10 flight simulator.
In these simulations the capabilities of the board were extensively tested, but I ran into three minor drawbacks: time, frame availability and take-off. The first two are due to the software available: because of its demo nature, it comes with limited usage time (10 minutes per session) and a single RC plane preloaded.
In situ
After testing the capabilities of the autopilot in a simulated environment, I moved on to test it on a pre-owned plane. First I verified the correct functioning of the movable surfaces on the ground; following this first test, I moved on to examine its ability to correctly level the plane when flying.
Finally I set the loiter function on the flight planner to have the plane circle around a fixed location at a given altitude.
What’s so special about an annular solar eclipse? Unlike a total eclipse—where the Sun is eventually blocked out entirely by the moon—with an annular eclipse the moon appears slightly smaller and never fully covers the Sun, even when the two are completely aligned. Such an event usually occurs with striking visual effect: in the moment of perfect alignment the Sun looks like a glowing golden ring with a hollowed-out core.
Ved and the HiMARC crew travelled to a park in Northern California in order to be along the path of the annular eclipse
On May 20th, 2012, Ved and the HiMARC team set out for Lassen Peak in Northern California, where they knew the path of the eclipse would run. By setting up their telescopic equipment at this location they could ensure that their images of the event would turn out like few others': only along this path would the solar ring surrounding the moon appear with equal thickness.
To capture the perfect image, Ved used a hydrogen-alpha bandpass filter, through which the details of the Sun's swirling hydrogen surface are most visible. For each phase of the eclipse, nine different shots were made at varying levels of exposure. He could then digitally process the nine photographs, ultimately creating a single high dynamic range composite image.
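The merge step can be sketched as a weighted average of the bracketed exposures: each frame is scaled back to scene radiance by its exposure time, with near-saturated and near-black pixels down-weighted. This is a toy version for 8-bit values, not the actual processing used for the eclipse images:

```python
import numpy as np

def merge_hdr(frames, exposure_times):
    """Weighted merge of bracketed exposures into one radiance map.
    Each frame is scaled by 1/exposure and weighted by a hat function,
    so blown-out and underexposed pixels contribute little."""
    num = np.zeros_like(frames[0], dtype=float)
    den = np.zeros_like(frames[0], dtype=float)
    for f, t in zip(frames, exposure_times):
        f = f.astype(float)
        # Hat weighting on [0, 255]: peaks at mid-gray, falls to 0 at the ends.
        w = np.clip(1.0 - np.abs(f / 255.0 - 0.5) * 2.0, 0.0, 1.0)
        num += w * f / t
        den += w
    return num / np.maximum(den, 1e-9)
```

Because every well-exposed frame votes for the same underlying radiance, the merged result preserves detail in both the bright ring and the dark limb.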
You can see photographs from the HiMARC team's trip to Mt. Lassen below, as well as additional high dynamic range images of the eclipse.
The ruins of the ancient Mayan city, Uxmal, go from impressive to unbelievable when you consider that these large stone structures were constructed without the use of wheels. It makes the myth surrounding the Pyramid of the Magician—that the structure was conjured by a dwarf over a single night—seem almost comparably plausible.
Today the ancient city is well preserved, though photography has been greatly limited by the Mexican government. Capturing this panoramic, high dynamic range shot of the ruins was a rare opportunity, and no other known photograph of such resolution and quality exists. The photograph is a stitched composite of over 5000 shots projected onto an equirectangular plane, allowing you not only to appreciate the full 360° experience, but also to enjoy the many fine details of the carved stone.
To see more about the making of this image, click here.
What does the world look like from the top of the ancient Mayan pyramid of Oxkintok? Well, something like this. Thousands of images were captured and stitched together to create this stereographic projection of the full view from up top.
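The "little planet" look comes from the stereographic projection: a point on the sphere at angular distance θ from the nadir lands on the plane at radius r = 2f·tan(θ/2). A minimal sketch of the mapping, with f an arbitrary scale factor:

```python
import math

def little_planet(colat_deg, az_deg, f=1.0):
    """Stereographic projection from the viewing sphere to the plane.
    colat_deg is the angle from the nadir (0 = straight down; 180 =
    straight up, which maps to infinity); az_deg is the azimuth."""
    r = 2.0 * f * math.tan(math.radians(colat_deg) / 2.0)
    a = math.radians(az_deg)
    return r * math.cos(a), r * math.sin(a)
```

Because the projection is conformal, circles on the sphere stay circles in the image, which is why the horizon wraps into the neat ring around the "planet."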
In the past few years, more and more ruins like Oxkintok have been discovered after centuries under thick jungle foliage. Assisted by high resolution satellite imaging, archeologists have been able to pinpoint locations where vegetation has grown over limestone, thanks to its slightly different coloring, discernible only through multi-spectral imaging.
To see more about the making of this image, click here.
18 kilometers south of the more famous Uxmal, via an anciently constructed pedestrian causeway, are the ruins of Kabah. The buildings here can be easily identified as distinct among Mayan architecture for the repetition of a single pattern along the façade. These tiles of carved stone have been extremely well preserved for over 1000 years and are well documented in this high-resolution panoramic image of one of the city’s buildings.
Restrictions on photographing Mayan ruins imposed by the Mexican government have made large panoramas such as this very rare. This image is an HDR composite of over 5000 individual shots projected onto an equirectangular plane.
To see more about the making of this image, click here.
In a traditional Kenyan Masai village, cattle farming is not only at the center of their way of life but also literally at the center of where they live. For the sake of protecting their animals from wild predators, individual family huts are built to tightly encircle their livestock field.
Ved came to the Masai Mara village while doing a photo shoot for the nonprofit Samasource. He traveled to the location overnight so that he could capture the setting right at sunrise, when the quality of light was at its best and the mud fields would glow a golden red.
In this photo, Ved combined HDR imaging with a stereographic projection of over 5000 shots stitched into a composite image. Look in every direction to see the way the Masai homes surround the cattle field.
To see more about the making of this image, click here.
]]>
The stony face of the Ankarana Reserve is uniquely and immediately recognizable. It boasts gigantic limestone massifs worn and shaped by the heavy Madagascar rains over hundreds of millions of years. Throughout the reserve the erosion has created large landscapes filled with pointy pinnacles known as “tsingy.” The term comes from the local Malagasy word for tiptoeing, evoking the way one might traverse the terrain by hopping from one stone spire to the next.
In this image, a full 360 degree panorama of the Ankarana terrain is in view. Vast fields of tsingy stretch as far as the eye can see, while the foreground displays the furrowed limestone textures in fine detail. Look closely and you can make out fossil shells within the stone.
]]>
To capture this shot, Ved waded nearly half a kilometer through the shallow shelf of Ramena beach to a point where he set up his equipment in neck-deep water. His effort proved worthwhile when he captured this image of the fishing village’s white sand beaches set against pristine emerald waters.
The area surrounding Ramena beach, at the northernmost tip of Madagascar, is known as Les Trois Baies, or the Three Bays. It lies near the point where the Mozambique Channel meets the Indian Ocean.
This panoramic shot represents a composite of over 5000 shots taken from one position as the camera rotated a full 360°. This synthesis approach allows the image to provide incredibly clear views in every direction.
To see more about the making of this image, click here.
]]>
Twelve months ago, the Space Shuttle, one of the most iconic figures of the 20th century, was finally retired. Apart from its legacy of science and exploration, the Space Shuttle inspired generations of children who grew up with the image of it arcing upwards into the blue sky atop its orange external fuel tank and solid rocket boosters.
With the shuttle program abandoned, the Vehicle Assembly Building remains as a monument to this legendary 30-year run. The world’s largest single-story building, it is impressive for its sheer massiveness alone. But far more important, this is the workshop where the world’s most technologically sophisticated vehicles were assembled; where precision, invention and ambition led to ever greater scientific feats.
These views of the VAB were captured using a combination of processes including high dynamic range imaging as well as stereographic projection.
To see more about the making of this image, click here.
]]>
After a successful mission, with their exterior ceramic tiles largely charred from the crucible of re-entry into Earth’s atmosphere, NASA’s space shuttles were hardly discarded. These space vehicles were engineered for reuse. But with over a million moving parts, each fundamental to the success of the mission and the safety of the crew, maintaining and preparing them for each new flight was an extremely daunting task.
It was within the Orbiter Processing Facility that painstaking effort went into checking each of the shuttle’s systems, ensuring that the spacecraft was ready to fly safely again. On average, three months of exhaustive inspection went into each shuttle, during which time worn and defective parts were replaced. The recently retired space shuttle Discovery completed 39 successful missions, attesting to the excellence achieved in this hangar.
This image was captured from above the facility floor, using a composite of over 5000 photographs shot in every direction.
To see more about the making of this photo, click here.
]]>
A foothill trail known as “The Dish” runs behind the Stanford University campus, beloved locally for its breathtaking views dotted with gigantic radio telescopes. Though the trail can get steep, runners and walkers are quickly hooked by the otherworldly beauty up top.
This image provides an HDR composite panorama of the northwest-facing view from the trail. To see more about the making of this image, click here.
]]>