Thursday, 28 February 2013

D mesons flipping

The magnet at the LHCb detector

A result today from the LHCb had me humming:
"On the 18th day of shut-down, my true love sent to me: D mesons flipping".

To put this in English, the LHCb collaboration at the Large Hadron Collider has published highly significant results indicating that it has detected particles called D mesons oscillating from matter into antimatter. The results come 18 days after the LHC shut down for maintenance.

Their paper, pre-published on arXiv, outlines how the D mesons, the last of the four types of mesons to be 'observed' undergoing this oscillation, were detected to a five-sigma level of certainty.
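For readers unfamiliar with the jargon, "five sigma" expresses how unlikely the signal would be if it were a mere statistical fluctuation. A quick sketch of the arithmetic (my own illustration, using only the Python standard library) shows the one-sided probability corresponding to five standard deviations of a Gaussian:

```python
import math

def sigma_to_p(n_sigma: float) -> float:
    """One-sided tail probability of a Gaussian at n_sigma standard deviations."""
    return 0.5 * math.erfc(n_sigma / math.sqrt(2))

p = sigma_to_p(5)
# Roughly 3 in 10 million: the chance of such a signal arising as a fluke.
print(f"5 sigma corresponds to p = {p:.2e}")
```

This is why five sigma is the particle physics community's conventional threshold for claiming a discovery rather than mere "evidence".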

As science writer Jason Palmer puts it:
"In the complicated zoo of subatomic physics, particles routinely decay into other particles, or spontaneously change from a matter type to their antimatter counterparts. This "oscillation" forms an important part of the theory that attempts to tame the zoo - the Standard Model. Mesons are part of a large family of particles made up of the fundamental particles known as quarks. The protons and neutrons at the centres of the atoms of matter we know well are each made up of three such quarks.
Mesons, on the other hand, are made of just two - specifically one quark and one antimatter quark. Theory holds that four members of the meson family can undergo the matter-antimatter oscillation - the matter and antimatter quarks both flip to their opposites."

The LHCb, which has had a series of stunning results with B mesons, had previously observed two types of B mesons and a K meson oscillating between matter and antimatter. But with this new paper, the team provides evidence that the last of the four types of particles, the D meson, has now been detected undergoing the same type of oscillation.

As Chris Parkes, LHC researcher from the University of Manchester, said:
"This is a nice moment, it's a sort of completeness."

It is striking to note that the abstract on arXiv lists 60 authors, with another 550 omitted due to lack of space. This really underscores the collaborative nature of physics at the Large Hadron Collider.

The results are significant because the LHCb is investigating the unsolved question of why there is more matter than antimatter in the Universe. According to the Standard Model, particles of matter and antimatter come in pairs, and matter and antimatter should obey the same laws. Therefore, we ought to expect an equal amount of both. So the question, "where's all the antimatter?" has had physicists scratching their heads for some time. The team at LHCb are undertaking some of the most significant and fundamental work to try and answer this question.

LHCb's result comes just two and a half weeks after the LHC was shut down for a lengthy period of maintenance. But it emphasises just how much science will be carrying on whilst the main detectors are serviced and upgraded. The data collected by researchers during the first phase of collisions will be pored over for years to come, and we should expect more fascinating results like these in the coming months.

Source: http://arxiv.org/abs/1211.1230
http://www.bbc.co.uk/news/science-environment-21594357

Monday, 25 February 2013

Ensonifying space


It is very heartening and interesting to read so many fascinating articles emerging from my Tuning into the Universe piece for Huffington Post this weekend.

Scientists and journalists from the Huffington Post community have published a range of pieces on everything from data sonification, to asteroseismology, to the reminiscences of a former astronaut. Together these articles greatly expand the field of general knowledge around the physics of radio astronomy, and our capacity to sensorially experience it.

One of the pieces draws on an interview with radio astronomer, and the co-founder of the SETI Institute, Jill Tarter. Amplifying the central message of  Tuning into the Universe, Tarter notes that:
"when SETI listens to the cosmos, the institute is actually receiving electromagnetic radiation. And then, just the way your radio does, that energy can be used to make audible sound."

The pieces published in response to the article extend, expand and ensonify this notion.  Some of my favourites include:

The Sound of the Deep Sea of Space by radio astronomer Dr. Tyler Nordgren equates the universe with a vast ocean, echoing Carl Sagan's famous analogy from his series, Cosmos. He poetically maps out the methods of astronomical observation available to modern astronomers, beyond the detection of visible light. He notes: "as a young radio astronomer I learned early on that every time human beings have explored the world with new senses we have discovered new and amazing phenomena".

Voices Carry by Anna Leahy and Douglas Dechow explores the sonic signature of our own planet:
"The sound of the Earth's inherent dynamics -- the movement of atmosphere and oceans -- produces a steady drone as well. Lightning produces crackling, which scientists call sferics."

Voyager Golden Record
The article includes a memorable passage about the Voyager Golden Record, which contains 'greetings in 56 languages, natural sounds like thunder and crickets chirping, and music from around the world', encoded in audio and now travelling towards the outer reaches of our solar system on board Voyager.

In An Audible Tour of the Solar System? Sign Me Up!, astronomer and planetary scientist Jim Bell analyses our celestial neighbourhood, exploring the potential for acoustic sound on each of our nearest planets. The Perfect Quiet of Space by legendary astronaut Jerry L. Ross is the extraordinary account of his nine spacewalks, undertaken during his seven missions into space.

Jerry L. Ross on one of his nine spacewalks.

He writes eloquently about the silence which astronauts experience, when outside the International Space Station:
"Without the sophisticated listening devices scientists use on earth to hear the whispers of the universe, to an astronaut space is infinite quiet, a place where we bring the only sounds that break the silence."

Sound: The Music of the Universe by Mark Ballora and George Smoot III is an excellent overview of the practice of data sonification, which takes in the brilliant work of the xSonify team, who are making sonification applications for blind scientists. The article also refers to the emerging science of asteroseismology and exoseismology, which I talked about last Friday in my Sonic Acts talk.

They clearly explain why data sonification methods can be useful:
"Symbolic renderings create other perspectives. Literal renderings are not always compatible with the capabilities of our auditory system. When data points are treated as audio samples and played back at audio rates (typically at 44100 values/second) quick changes are lost to us, as we can't hear fluctuations discretely at the millisecond level. If, instead, we treat the data points symbolically, for example as pitches, we are better able to "magnify" what we are listening to."
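A minimal illustration of the symbolic approach Ballora and Smoot describe: rather than playing data values directly as audio samples, each value is mapped to a musical pitch. The sketch below is my own (it is not the xSonify code); it linearly rescales a data series onto a range of MIDI note numbers and converts each note to a frequency in hertz:

```python
def sonify_as_pitches(data, low_note=48, high_note=84):
    """Map each data point to a MIDI note in [low_note, high_note],
    then convert to a frequency in Hz (A4 = MIDI note 69 = 440 Hz)."""
    lo, hi = min(data), max(data)
    span = (hi - lo) or 1.0  # avoid division by zero for constant data
    freqs = []
    for x in data:
        note = low_note + (x - lo) / span * (high_note - low_note)
        freqs.append(440.0 * 2 ** ((note - 69) / 12))
    return freqs

# A rising data series becomes a rising melody.
print(sonify_as_pitches([0.0, 0.5, 1.0]))
```

Slow trends in the data become audible contours, which is precisely the "magnification" the quoted passage describes.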

In Understanding the Sound of Space, Ayodele Faiyetole notes that sound is underused in science. He draws on an interview with cosmologist Yuko Takahashi, who believes there's great value in presenting scientific results in a totally different dimension, such as sound:
"Maps of CMB anisotropy can be converted to sound as a telescope sweeps across the sky to give the audience a better appreciation of the fluctuations."

As Ballora and Smoot put it, "if the universe is, at some level, music, then it seems only natural that we should study it with musical tools of thinking."

Saturday, 23 February 2013

Tuning into the sound of the universe with radio




This weekend, Huffington Post have published my piece, Tuning into the Universe, which contextualises the TED talk they are featuring as part of TED Weekends.

The piece provides some background into the audified radio waves which I played during my talk. Here's the gist of the article:

"We have been surrounded by stunning portrayals of our own solar system and beyond for generations, in books, on film and on television. But in popular culture, we have no sense of what space sounds like.  And indeed, most people associate space with silence.
There are, of course, perfectly valid scientific reasons for assuming so. Space is a vacuum. Sounds cannot propagate in a vacuum.  But through the intervention of radio, it is possible for us to listen to the Sun's fizzling solar flares, the roaring waves and spitting fire of Jupiter's stormy interactions with its moon Io, pulsars' metronomic beats, or the eerie melodic shimmer of a whistler in the magnetosphere."

My talk, and my work in this area, emerges from the science of radio astronomy.


RT16 at the Ventspils International Radio Astronomy Centre, Latvia
Whilst optical astronomers use telescopes to look at the visible light emitted by stars, radio astronomers use radio telescopes, or antennae, to detect radio waves. By combining radio astronomy with radio and sound engineering, we can hear as well as see the stars, and thus greatly expand our sensory perception of our cosmos.

It is important to remember that stars and planets are not directly audible. The recordings I played in my talk are radio waves which have been converted into sound waves using radio receivers and amplifiers. This is a process I refer to as audification. Huffington Post have also published two companion pieces which respond to the talk, the first of which emphasises this point. Celestial Sound Effects by Seth Shostak notes correctly that, "they're electromagnetic noise, converted by electronic devices ... into signals that - when played through a loudspeaker - become the atmospheric pressure waves we call sound."
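At its simplest, audification means treating the sampled radio signal as an audio waveform: the signal is normalised into the range an audio output expects and played back at an audio sample rate. The sketch below is purely illustrative (it is not the actual signal chain I use), showing just the normalisation step:

```python
def audify(samples, peak=0.9):
    """Rescale a sampled radio-frequency signal into the [-peak, peak]
    range expected by audio hardware. The scaled samples are then simply
    played back at an audio rate (e.g. 44100 samples per second)."""
    m = max(abs(s) for s in samples)
    if m == 0:
        return [0.0] * len(samples)
    return [peak * s / m for s in samples]

# A raw signal in arbitrary units becomes a playable waveform.
print(audify([120.0, -300.0, 60.0]))
```

The point of the exercise is that nothing is invented: the shape of the electromagnetic signal is preserved, only translated into a medium our ears can perceive.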

The second piece is What Is the Color of the Universe? by Mario Livio, which uses Karl Glazebrook and Ivan Baldry's survey of more than 200,000 galaxies (the 2dF Galaxy Redshift Survey) as a basis for examining the colour of the universe.

Thanks to Huffington Post and Janet Lee at TED for publishing the piece.

And here's the talk in full:

Tuesday, 5 February 2013

New Zealand recognised as major contributor to radio astronomy history

John Bolton (left) and New Zealander Gordon Stanley (centre), pictured with Joe Pawsey
New Zealand has claimed its place in radio astronomy history. As reported here a year ago, New Zealand has significant scientific heritage in the field of radio astronomy, and has begun to explore and celebrate this history.

Last week, some of the biggest names in the field gathered for an international conference which marked New Zealand's role in helping to kick-start radio astronomy research in the 1940s.  Attended by the doyenne of the field, Jocelyn Bell Burnell, and researchers and historians from New Zealand, Australia and the UK, the conference explored the work of John Bolton and New Zealander, Gordon Stanley, who detected radio waves from outside the solar system in August 1948 from sites in Pakiri and Piha in the North Island of New Zealand.


Elizabeth Alexander
The conference also commemorated the pioneering work of Elizabeth Alexander, often referred to as the first female radio astronomer, who helped establish some of the early foundations of solar radio astronomy in 1946. Alexander studied sources of interference affecting radar stations in New Zealand established during World War II. During March-April 1945, solar radio emission was detected at 200 MHz by operators of a Royal New Zealand Air Force radar unit located on Norfolk Island.
The emissions became known as the "Norfolk Island effect". Alexander, then based at the Department of Scientific and Industrial Research in Wellington, heading up the Operational Research Section of the Radio Development Laboratory, carried out the most significant early work on the effect throughout 1945. In 1946, she published a paper in the journal Radio & Electronics describing the emissions, and in doing so, furthered the fledgling field of radio astronomy.

Wayne Orchiston, writing in "The New Astronomy", has noted that Alexander's research also led to further solar radio astronomy projects in New Zealand in the immediate post-war years, and was in part responsible for the launch of the radio astronomy program at the CSIRO, in Australia.

Radar Station (Whangaroa) - one of five involved in New Zealand's investigation of solar radio emission. Image courtesy of Wayne Orchiston.
As astronomer Miller Goss, from the National Radio Astronomy Observatory in New Mexico, puts it:
"Bolton and Stanley's discovery revolutionised twentieth century astronomy."

Following their pioneering discoveries, Bolton went on to become a major figure in Australian radio astronomy, helping found the famous Parkes radio telescope, becoming director of the Australian National Radio Astronomy Observatory and winning the inaugural Jansky Prize in 1966 (so named after the father of radio astronomy, Karl Jansky).

Sergei Gulyaev
The conference was organised by the extraordinary Sergei Gulyaev, who has revitalised radio astronomy in New Zealand, spearheading the nation's participation in the SKA, amongst many other efforts.

Sunday, 20 January 2013

Can particle physics transform our online experience?



One of the more intriguing papers we've happened upon early this year is some research from the University of Fribourg which suggests that particle physics can improve the technology behind recommendation systems.

Recommendation systems are software which websites such as Amazon and Facebook use to tailor information for users. When Amazon, somewhat eerily, suggests that you check out a book by a writer you've just been thinking about, that's a recommendation system, or "engine", in action. Recommendation engines are at the heart of online business, as the good ones are known to generate profits. Improving them is a key goal for retailers and software companies alike.

But what on earth does particle physics have to do with software like that? Well, a group of researchers at the University of Fribourg in Switzerland believe that the physics which governs the behaviour of photons and electrons may also be used to optimise recommendation engines.

Stanislao Gualdi, Matus Medo, and Yi-Cheng Zhang this month published an abstract on arXiv entitled, "Crowd Avoidance and Diversity in Socio-Economic Systems and Recommendation". The paper's key insight is that the problem with recommendation engines is that they can lead to 'overcrowding' around a specific product or service, which can be detrimental to the experience users have with it. Surges in demand can sometimes be advantageous, but if the value of a resource diminishes as more people use it, then this creates a problem. A good example would be a service like Netflix recommending the same movie to too many of its users, thereby creating long waiting times for everyone. Another good example would be a travel website recommending a beach or a picnic spot because it is quiet. As Technology Review noted, "this can end up destroying the peace that gives it value. Similarly, restaurant recommendations can lead to overcrowding or difficulty getting a table which again makes the dining experience unpleasant."

These examples show how over-stimulated demand can reduce the value of a resource. Gualdi, Medo and Zhang set about to tackle this issue.  They applied the logic employed in particle physics, where particles tend to occupy the most energetically favourable states. Technology Review explain:
"If the particles are bosons, such as photons, there is no limit to the number that can occupy a given state. But if they are fermions, like electrons, their physical properties dictate that no two can occupy the same state. Clearly the resulting distribution of these different types of particles is entirely different. The analogy here is with goods that any number of people can share or that only one person can have."

The latter case - products or experiences that are best experienced by small groups, or even individuals - provides a dilemma for recommendation engines. In order for these things to retain their value, they need to remain available only to limited numbers of people. Gualdi, Medo and Zhang insist that the principle of 'crowd avoidance' needs to be employed to avoid oversubscription, crowding and disappointment.
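As a toy illustration of 'crowd avoidance', imagine a recommender that gives each user their highest-scored item subject to a per-item capacity: a capacity of one mimics fermions (no two users share a state), while unlimited capacity mimics bosons. The sketch below is my own reading of the idea, not the authors' actual algorithm:

```python
def recommend_with_capacity(scores, capacity):
    """scores: {user: {item: score}}. Greedily give each user their
    best-scored item that still has spare capacity (crowd avoidance)."""
    occupancy = {}
    picks = {}
    for user, prefs in scores.items():
        for item in sorted(prefs, key=prefs.get, reverse=True):
            if occupancy.get(item, 0) < capacity:
                occupancy[item] = occupancy.get(item, 0) + 1
                picks[user] = item
                break
    return picks

scores = {"ann": {"beach": 0.9, "museum": 0.6},
          "bob": {"beach": 0.8, "museum": 0.7}}
print(recommend_with_capacity(scores, capacity=1))   # fermion-like: the crowd is avoided
print(recommend_with_capacity(scores, capacity=10))  # boson-like: everyone sent to the beach
```

With a capacity of one, the second user is diverted to their next-best choice rather than overcrowding the first, which is exactly the behaviour the paper argues can improve both accuracy and diversity.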

They provide evidence in their paper that building crowd avoidance into the recommendation process can increase the accuracy of the recommendation, and therefore the potential profitability of the recommendation engine. As they put it:
"We use real data to show that contrary to expectations, the introduction of these constraints enhances recommendation accuracy and diversity even in systems where overcrowding is not detrimental. The observed accuracy improvements are explained in terms of removing potential bias of the recommendation method."

It's a fascinating and quirky approach, and if they are right, and their technique is employed by the software developers who design and build our online world, it might just transform what is recommended to us, and when, and how we experience it.

Source: http://arxiv.org/abs/1301.1887

It's full of stars


This week the noted photographer and optics engineer, Stéphane Guisard posted a stunning mosaic image of the Milky Way captured at ESO's Paranal Observatory in Chile.

The mosaic is constructed from 52 fields, shot over 29 nights, and consists of 1200 separate photos and 1 billion pixels. The image transports us to the centre of the Milky Way, giving us a scalable and zoomable encounter with literally millions of stars.

As Guisard notes, "it shows the region spanning from Sagittarius (with the Milky Way center and M8/M20 area on the left) to Scorpius (with colorful Antares and Rho Ophiuchus region on the right) and Cat Paw nebula (red nebula at the bottom)."

This is the galaxy in extreme fidelity. Enjoy getting lost in the detail.

Source: http://sguisard.astrosurf.com/Pagim/GC.html

Sunday, 2 December 2012

Did Einstein discover dark energy?


One of the more interesting things that popped up on arXiv this week was the quirkily titled abstract, How Einstein Discovered Dark Energy, submitted by Alex Harvey, Visiting Scholar at New York University on 22 November.

It bears repeating in its entirety:

"In 1917 Einstein published his Cosmological Considerations Concerning the General Theory of Relativity. In it was the first use of the cosmological constant. Shortly thereafter Schrodinger presented a note providing a solution to these same equations with the cosmological constant term transposed to the right hand side thus making it part of the stress-energy tensor. Einstein commented that if Schrodinger had something more than a mere mathematical convenience in mind he should describe its properties. Then Einstein detailed what some of these properties might be. In so doing, he gave the first description of Dark Energy. We present a translation of Schrodinger's paper and Einstein's response."
The full paper and references are downloadable here.


It will be interesting to hear what the responses are to this.

Source: http://arxiv.org/abs/1211.6338

Tuesday, 27 November 2012

The rise of the algorithms

The Knife - a financial trading algorithm

This week Tarleton Gillespie was the latest critical commentator to analyse the role that algorithms play in contemporary life. His detailed, insightful and urgent essay, The Relevance of Algorithms, was published on the Culture Digitally blog this week, ahead of its forthcoming publication in the MIT book, "Media Technologies".



Picking up where Kevin Slavin's excoriating Lift talk, "Those algorithms that govern our lives" (later reprised for TED) left off, Gillespie conducts a thorough investigation into how algorithms provide the basis for a great deal of our individual and societal choices. That we understand their impact on our daily lives so poorly is cause for great concern, Gillespie argues. He notes a particular anxiety with the way that algorithms are starting to influence how we find and interpret information, and points to the obvious impact this will have on politics:

"Algorithms not only help us find information, they provide a means to know what there is to know and how to know it, to participate in social and political discourse."

This has strong and relevant echoes back to the set of concerns raised by both designers and artists working with technology, as alluded to in a previous post. Artists Julian Oliver, Danja Vasiliev and Gordan Savicic have developed a discourse they refer to as "critical engineering", which aims to expose the systems, mechanisms, languages and logics which make up our engineered world. This is urgent, political work, they argue, as the encroachment of engineering into our lives is matched only by its increasing invisibility. If we lose our ability to perceive this technological infrastructure, we lose agency.

As Oliver wrote recently:
"As thinkers with technical abilities in several areas, we want to take on our built and increasingly automated environment [...] If there's ever a time to be doing that, it's now, especially with opaque and hidden infrastructure in the telecommunications space deeply impacting diplomatic relations and civil liberties world wide. [...] Our inability to describe and understand technological infrastructure reduces our critical reach, leaving us both disempowered and, quite often, vulnerable." - Julian Oliver (September 2012)

These thoughts are echoed almost precisely by designer, writer and publisher James Bridle (whose work in this area was referenced here recently), who notes:
"By legibility I mean our own ability to read these systems, how much they can affect the way we see and act in the world, and the differing positions of power we have in the world based on how legible those systems are. [...] Programmers have a huge amount of agency in the world, because they can deconstruct, reverse engineer and write and construct and create these systems. People who can't, don't, and they have less power in the world because of it." - James Bridle (September 2012). He later wrote:
"Those who cannot perceive the network cannot act effectively within it, and are powerless. The job, then, is to make such things visible."

Gillespie's essay operates very much within this spirit, insisting on the need to be able to perceive and understand the way that algorithms are becoming part of our lived environment. He writes:
"What we need is an interrogation of algorithms as a key feature of our information ecosystem, and of the cultural forms emerging in their shadows, with a close attention to where and in what ways the introduction of algorithms into human knowledge practices may have political ramifications."

His essay then seeks to do just that, providing an excellent map for this emerging terrain.  His perspective is not technological, but rather sociological, an analysis which he insists "must not conceive of algorithms as abstract, technical achievements, but must unpack the warm human and institutional choices that lie behind these cold mechanisms."

His essay is a vital insight into these choices.  Resonating with the worlds of both Oliver and Bridle, he concludes:
"In many ways, algorithms remain outside our grasp, and they are designed to be. This is not to say that we should not aspire to illuminate their workings and impact. We should. But we may also need to prepare ourselves for more and more encounters with the unexpected and ineffable associations they will sometimes draw for us, the fundamental uncertainty about who we are speaking to or hearing, and the palpable but opaque undercurrents that move quietly beneath knowledge when it is managed by algorithms."

Sources:
http://culturedigitally.org/2012/11/the-relevance-of-algorithms/
http://www.tarletongillespie.org/

Sunday, 18 November 2012

Your move, theorists

B-mesons' decay products

Physicists rather enjoy the friendly rivalry between theorists and experimentalists, and this week has been a fascinating week for both. Results presented at the Hadron Collider Physics symposium in Kyoto this week have proved a triumph for experimentalists working with the most powerful tool at their disposal - the LHC. But theorists have been left scratching their heads, as one of the most prominent theories of "new physics" took a major hit. Since the search for physics beyond the Standard Model is a leading priority in particle physics, this is highly significant for everyone.

Unsurprisingly, the hot topic of discussion at the Hadron Collider Physics symposium was the LHC's incendiary boson results from earlier this year.

Both the teams from the CMS detector and the ATLAS detector presented new analyses of the observations which led them to announce the discovery of a Higgs-like boson in July. What's striking about the new data is that it backs up initial suspicions that the boson discovered at the LHC appears to be behaving precisely as the Standard Model predicted it would. As Tommaso Dorigo noted in Quantum Diaries, the new measurements "confirm the standard model interpretation of the new found object." Philip Gibbs at viXra provides some further technical analysis of the results here.

The LHCb Experiment. Photo courtesy of CERN.

But perhaps more sensational were the new results presented by their colleagues over at the LHCb Experiment. Johannes Albrecht reported that the LHCb team have observed one of the rarest particle decay events in physics, a Bs meson decaying into two muons. These events are so rare that the Standard Model predicts they should only occur about once in 300 million collisions.

That LHCb has observed one of these events at all is a stunning achievement for the experimentalists working on the detector. But the fact that their results suggest that the Bs meson decay is every bit as rare as the Standard Model predicted it should be is a serious blow for one of the leading theories of new physics - supersymmetry.

The theory of supersymmetry, often referred to by its shortened nickname, SUSY, states that every fundamental matter particle should have a more massive, or 'super', force carrier partner, and every force carrier should have a 'super' matter partner. These particles are often referred to as 'sparticles' (supersymmetric particles). Supersymmetry has been championed by theorists such as Savas Dimopoulos and Gordon Kane, who memorably described the theory as a "wonderful, beautiful and unique" solution for the problems in our understanding of the subatomic world.

However, the LHCb results have cast serious doubt on the viability of supersymmetry as a theory. Supersymmetry predicts that if superparticles exist, the Bs meson decay to a pair of muons should occur far more often than once in 300 million collisions. The work the LHCb team were doing has long been considered the most important experimental test for supersymmetry. Whilst the LHCb results, which prove the rarity of the decay, don't rule supersymmetry out altogether, the parameters for superparticles have narrowed dramatically, making the theory a much less likely explanation for the mysteries in our subatomic world than many had hoped.

Why is this significant? Well, all physicists know that the Standard Model, despite its elegance, does not function as a complete explanation for the forces which govern our universe. It provides little explanation for gravity, and it noticeably fails to explain either dark energy or dark matter. Given that it is believed that dark matter may constitute up to 84% of all matter in the universe, and dark energy up to 73% of all the known energy in the universe, a theory which explains neither is clearly inadequate.

The science community at large had been hoping that the experiments running at the LHC would start to uncover evidence for "new physics" beyond the Standard Model, which would begin to explain these puzzling features of the universe. But so far, not only have the main results not done so, they've simply provided ever-strengthening evidence for the veracity of the Standard Model.

As Marc-Olivier Bettler from LHCb noted this week, "if new physics is present then it is hiding very well behind the Standard Model".

A typical candidate event for the Higgs boson measured in the CMS electromagnetic calorimeter. Image courtesy of CERN

The fact that CMS and ATLAS this week seemed to be describing a Higgs boson which looks awfully like the one predicted by the Standard Model is compounding theoretical concern. Expressing this eloquently this week, Guido Altarelli from CERN stated that a Standard Model Higgs was "a toy model to make the theory match the data, a crutch to allow the Standard Model to walk a bit further until something better comes along."

As Matthew Chalmers noted in an article which starkly set out the challenges the experimental results are raising, a Higgs boson at 125 GeV (the mass for which both CMS and ATLAS provided further evidence this week) not only has a mass "vastly less than it should be, it is also about as small as it can possibly be without dragging the universe into another catastrophic transition. If it were just a few GeV lighter, the strength of the Higgs interactions would change in such a way that the lowest energy state of the vacuum would dip below zero. The universe could then at some surprise moment "tunnel" into this bizarre state, again instantly changing the entire configuration of the particles and forces and obliterating structures such as atoms."
He dramatically intoned, "as things stand, the universe is seemingly teetering on the cusp of eternal stability and total ruin."

All of this is to say that whilst results presented by ATLAS, CMS, and the LHCb are bringing relief in some quarters, as they certainly prove how exceptional the LHC is as a tool of discovery, they are causing some deep unease amongst theorists.

These are exciting times.

As particle physicist Ben Still observed earlier this year, "until theorists can come up with ways we can test their theories, they are just dealing with works of fiction."

So after some cracking moves this week, in which the experimentalists have put paid to some of the most treasured literary works of physics theory, the ball is now back in the court of the theorists. They need to dream up new theories which help make sense of these results, and suggest new routes forward.

Sources:
http://www.nature.com/
http://www.quantumdiaries.org/

Saturday, 10 November 2012

A drones-eye-view: revealing the killing fields

Jaar, Yemen, October 18 2012 / 7-9 killed. Image from Dronestagram by James Bridle

The unmanned aerial vehicle (UAV), or drone, has become one of the most potent weapons of contemporary warfare. Remotely controlled by operators thousands of miles away from the theatre of war, drones carry out aerial attacks which leave hundreds of people dead. The increasing amount of 'collateral damage' from US drone strikes on the Pakistan-Afghanistan border recently led the prominent politician Imran Khan to stage a high-profile protest against their use.

Drone Vision by Trevor Paglen

Artists have been actively documenting the impact of the use of drones in warfare for some years now.   Trevor Paglen's Drone Vision, recently on show at Lighthouse in Brighton, provides us with a chilling "drones-eye-view" of a landscape, enabling us to see what drone-operators see.


Five Thousand Feet is the Best by Omer Fast

The utterly compelling and disturbing film installation, Five Thousand Feet is the Best by Israeli artist Omer Fast tells the story of a former Predator drone operator, recalling his experience of using drones to fire at civilians and militia in Afghanistan and Pakistan.  At one stage of the film, he describes the use of what marines refer to as "the light of god", the laser targeting marker, which is used to direct hellfire missiles to their intended target.

"We call it in, and we're given all the clearances that are necessary, all the approvals and everything else, and then we do something called the Light of God - the Marines like to call it the Light of God. It's a laser targeting marker. We just send out a beam of laser and when the troops put on their night vision goggles they'll just see this light that looks like it's coming from heaven. Right on the spot, coming out of nowhere, from the sky. It's quite beautiful." (quoted from Five Thousand Feet is the Best).

The Light of God by James Bridle

Writer, publisher, web developer and artist James Bridle responded to this by creating his own work, The Light of God.

Sharing Paglen and Fast's concern with the use of drones in warfare, Bridle has created a series of projects which attempt to reveal their presence in the landscape. His Drone Shadow interventions are one-to-one representations of the MQ-1 Predator unmanned aerial vehicle, drawn to scale within urban landscapes. The first was drawn in London this February (in collaboration with Einar Sneve Martinussen), and the second in Turkey this October as part of the Istanbul Design Biennial.

Drone Shadow 002 by James Bridle

Like Paglen and Fast, Bridle's work stems from a deep concern with increasingly invisible and seamless military technologies that are creating the context for "secret, unaccountable, endless wars".

Bridle writes, "the drone also, for me, stands in part for the network itself: an invisible, inherently connected technology allowing sight and action at a distance. Us and the digital, acting together, a medium and an exchange. But the non-human components of the network are not moral actors, and the same technology that permits civilian technological wonder, the wide-eyed futurism of the New Aesthetic and the unevenly-distributed joy of living now, also produces obscurantist "security" culture, ubiquitous surveillance, and robotic killing machines. [....] We all live under the shadow of the drone, although most of us are lucky enough not to live under its direct fire. But the attitude they represent - of technology used for obscuration and violence; of the obfuscation of morality and culpability; of the illusion of omniscience and omnipotence; of the lesser value of other peoples lives; of, frankly, endless war - should concern us all."

His latest work, released yesterday, is Dronestagram. Bridle has been collecting images of the locations of drone strikes, and sharing these photographs on the photo-sharing site Instagram. His intention is to make these locations more visible, bringing them closer to us, and in the process perhaps making the reality of the daily occurrence of deadly drone strikes more tangible.

He utilises public records from the Bureau of Investigative Journalism, which documents strikes as they happen in Pakistan, Yemen and Somalia. After confirming the location of a strike, he uses Google Maps to create a satellite image of the targeted location. The image, accompanied by a description of the site and the death toll, if known, is uploaded to Instagram.


Wadi Abu Jabara, Yemen, 28 October 2012. 3 killed. Image from Dronestagram by James Bridle

The images of deserted, barren landscapes and abandoned buildings have a sobering potency when juxtaposed with the banal pictures of pets and parties that populate Instagram. But it is what we don't see that gives these images such emotional power: the mortality.

Bridle writes, "drones are just the latest in a long line of military technologies augmenting the process of death-dealing, but they are among the most efficient, the most distancing, the most invisible. These qualities allow them to do what they do unseen [...]. Whether you think these killings are immoral or not, most of them are by any international standard illegal."

The work of artists such as Trevor Paglen, Omer Fast, and James Bridle exists within a long tradition of artists bearing witness to events that our governments and military would prefer we didn't see.

But Bridle's work is also part of an ongoing collective effort by both artists and engineers to reveal the technological infrastructures that enable events like drone strikes to occur. As technology becomes more ubiquitous, and our relationship with our devices becomes ever more seamless, our technical infrastructure is becoming ever more invisible. When our environment becomes opaque or invisible, it becomes difficult to interpret it and to act within it. As artist and critical engineer Julian Oliver recently noted, "our inability to describe and understand technological infrastructure reduces our critical reach, leaving us both disempowered and, quite often, vulnerable."

Or as Bridle puts it, "those who cannot perceive the network cannot act effectively within it, and are powerless. The job, then, is to make such things visible."


Sources: