The new iPhone is ugly

I’ll be the first to admit that I’m a bit old-fashioned when it comes to phones. Everyone scoffs at my iPhone SE, but the truth is it’s the best phone Apple ever made — a beautiful, well-designed object in just about every way. But damn is the iPhone 11 Pro ugly. And so are the newest phones from Samsung and Google, while we’re at it.

Let’s just get right to why the new iPhones are ugly, front and back. And sideways. We can start with the notch. Obviously it’s not new, but I thought maybe this would be some kind of generational anomaly that we’d all look back and laugh at in a year or two. Apparently it’s sticking around.

I know a lot of people have justified the notch to themselves in various ways — it technically means more raw screen space, it accommodates the carrier and battery icons, it’s necessary for unlocking the phone with your face.

Yeah, but it’s ugly.

If they removed the notch, literally no one would want the version with the notch, because it’s so plainly and universally undesirable. If Apple’s engineers could figure out a way to have no notch, they’d have done it by now, but they can’t and I bet they are extremely frustrated by that. They try to hide it with the special notch-camouflaging wallpaper whenever they can, which is as much as saying, “hey, we hate looking at it too.”

You can forget for a few seconds. But in the back of your mind you know it’s there. Everyone knows.

It’s a prominent, ugly compromise (among several) necessitated by a feature no one asked for and that people can’t seem to decide whether they even like. Notches are horrible and any time you see one, it means a designer cried themselves to sleep. To be fair, that probably happens quite a bit. I grew up around designers and they can be pretty sensitive, like me.

I’m not a big fan of the rounded screen corners for a couple reasons, but I’ll let that go because I envision a future where it doesn’t matter. You remember how in Battlestar Galactica the corners were clipped off all the paper? We’re on our way.

Having the screen extend to the very edge of the device on the other hand isn’t exactly ugly, but it’s ugly in spirit. The whole front of the phone is an interface now, which would be fine if it could tell when you were gripping the screen for leverage and not to do something with it. As it is, every side and corner has some kind of dedicated gesture that you have to be wary of activating. It’s so bad people have literally invented a thing that sticks out from the back of your phone so you can hold it that way. Popsockets wouldn’t be necessary if you could safely hold your phone the way you’d hold any other object that shape.

The back is ugly now, too. Man, is that camera bump bad. Bump is really the wrong word. It looks like the iPhone design team took a field trip to a maritime history museum, saw the deep sea diving helmets, and thought, Boom. That’s what we need. Portholes. To make our phone look like it could descend to 4,000 fathoms. Those helmets are actually really cool looking when they’re big and made of strong, weathered brass. Not on a thin, fragile piece of electronics. Here it’s just a huge, chunky combination of soft squares and weirdly arranged circles — five of them! — that completely takes over the otherwise featureless rear side of the phone.

The back of the SE is designed to mirror the front, with a corresponding top and bottom “bezel.” In the best looking SE (mine) the black top bezel almost completely hides the existence of the camera (unfortunately there’s a visible flash unit); it makes the object more like an unbroken solid, its picture-taking abilities more magical. The camera is completely flush with the surface of the back, which is itself completely flush except for texture changes.

The back of the iPhone 11 Pro has a broad plain, upon which sits the slightly higher plateau of the camera assembly. Above that rise the three different little camera volcanoes, and above each of those the little calderas of the lenses. And below them the sunken well of the microphone. Five different height levels, producing a dozen different heights and edges! Admittedly the elevations aren’t so high, but still.

If it was a dedicated camera or another device that by design needed and used peaks and valleys for grip or eyes-free navigation, that would be one thing. But the iPhone is meant to be smooth and beautiful, with a nice handfeel. With this topographic map of Hawaii on the back? Have fun cleaning out the grime from in between the volcanoes, then knocking the edge of the lens against a table as you slide the phone into your hand.

Plus it’s ugly.

The sides of the phones aren’t as bad as the front and back, but we’ve lost a lot since the days of the SE. The geometric simplicity of the + and – buttons, the hard chamfered edge that gave you a sure grip, the black belts that boldly divided the sides into two strips and two bows. And amazingly, due to being made of actual metal, the more drops an SE survives, the cooler it looks.

The sides of the new iPhones look like bumpers from cheap model cars. They look like elongated jelly beans, with smaller jelly beans stuck on that you’re supposed to touch. Gross.

That’s probably enough about Apple. They forgot about good design a long time ago, but the latest phones were too ugly not to call out.

Samsung has a lot of the same problems as Apple. Everyone has to have an “edge to edge” display now, and the Galaxy S10 is no exception. But it doesn’t really go to the edge, does it? There’s a little bezel on the top and bottom, but the bottom one is a little bigger. I suppose it reveals the depths of my neurosis to say so, but that would never stop bugging me if I had one. If it was a lot bigger, like HTC’s old “chins,” I’d take it as a deliberate design feature, but just a little bigger? That just means they couldn’t make one small enough.

As for the display slipping over the edges, it’s cool looking in product photos, but I’ve never found it attractive in real life. What’s the point? And from anywhere other than straight on, it makes the phone look lopsided, or like you’re missing something on the far side.

Meanwhile it not only has bezels and sometimes curves, but a hole punched out of the front. Oh my god!

Here’s the thing about a notch. When you realize as a phone designer that you’re going to have to take over a big piece of the front, you also look at what part of the screen it leaves untouched. In Apple’s case it’s the little horns on either side — great, you can at least put the status info there. There might have been a little bit left above the front camera and Face ID stuff, but what can you do with a handful of vertical pixels? Nothing. It’ll just be a distraction. Usually there was nothing interesting in the middle anyway. So you just cut it all out and go full notch.

Samsung on the other hand decided to put the camera in the top right, and keep a worthless little rind of screen all around it. What good is that part of the display now? It’s too small to show anything useful, yet the hole is too big to ignore while you’re watching full-screen content. If their aim was to make something smaller and yet even more disruptive than a notch, mission accomplished. It’s ugly on all the S10s, but the big wide notch-hole combo on the S10 5G 6.7″ phablet is the ugliest.

The decision to put all the rear cameras in a long window, like the press box at a hockey game, is a bold one. There’s really not much you can do to hide 3 giant lenses, a flash, and that other thing. Might as well put them front and center, set off with a black background and chrome rim straight out of 2009. Looks like something you’d get pointed at you at the airport. At least the scale matches the big wide “SAMSUNG” on the back. Bold — but ugly.

Google’s Pixel 4 isn’t as bad, but it’s got its share of ugly. I don’t need to spend too much time on it, though, because it’s a lot of the same, except in pumpkin orange for Halloween season. I like the color orange generally, but I’m not sure about this one. Looks like a seasonal special phone you pick up in a blister pack from the clearance shelf at Target, the week before Black Friday — two for $99, on some cut-rate MVNO. Maybe it’s better in person, but I’d be afraid some kid would take a bite out of my phone thinking it’s a creamsicle.

The lopsided bezels on the front are worse than the Samsung’s, but at least it looks deliberate. Like they wanted to imply their phone is smart so they gave it a really prominent forehead.

I will say that of the huge, ugly camera assemblies, the Pixel’s is the best. It’s more subtle, like being slapped in the face instead of kicked in the shins so hard you die. And the diamond pattern is more attractive for sure. Given the square(ish) base, I’m surprised someone on the team at Google had the rather unorthodox idea to rotate the cameras 45 degrees. Technically it produces more wasted space, but it looks better than four circles making a square inside a bigger, round square.

And it looks a hell of a lot better than three circles in a triangle, with two smaller circles just kind of hanging out there, inside a bigger, round square. That iPhone is ugly!

AI is helping scholars restore ancient Greek texts on stone tablets

Machine learning and AI may be deployed on such grand tasks as finding exoplanets and creating photorealistic people, but the same techniques also have some surprising applications in academia: DeepMind has created an AI system that helps scholars understand and recreate fragmentary ancient Greek texts on broken stone tablets.

These clay, stone or metal tablets, inscribed as much as 2,700 years ago, are invaluable primary sources for history, literature and anthropology. They’re covered in letters, naturally, but often the millennia have not been kind and there are not just cracks and chips but entire missing pieces that may comprise many symbols.

Such gaps, or lacunae, are sometimes easy to complete: If I wrote “the sp_der caught the fl_,” anyone can tell you that it’s actually “the spider caught the fly.” But what if it were missing many more letters, and in a dead language, to boot? Not so easy to fill in the gaps.
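
The spider-and-fly example is basically constrained fill-in-the-blank, and a toy version is easy to sketch in Python (the word list and matching approach here are purely illustrative, not DeepMind’s actual method):

```python
import re

# Tiny stand-in vocabulary; a real system draws on a large corpus.
VOCAB = ["spider", "spade", "fly", "flu", "flan", "caught", "the"]

def candidates(damaged: str) -> list[str]:
    """Return vocabulary words consistent with a damaged word,
    where '_' marks a lost character."""
    pattern = re.compile(damaged.replace("_", "."))
    return [w for w in VOCAB if pattern.fullmatch(w)]
```

Here `candidates("sp_der")` returns just `["spider"]`, while `candidates("fl_")` returns both `["fly", "flu"]`, which is exactly the ambiguity that makes real lacunae hard: the more letters missing, the more completions fit.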

Doing so is a science (and art) called epigraphy, and it involves both intuitive understanding of these texts and others to add context; one can make an educated guess at what was once written based on what has survived elsewhere. But it’s painstaking and difficult work — which is why we give it to grad students, the poor things.

Coming to their rescue is a new system created by DeepMind researchers that they call Pythia, after the oracle at Delphi who translated the divine word of Apollo for the benefit of mortals.

The team first created a “nontrivial” pipeline to convert the world’s largest digital collection of ancient Greek inscriptions into text that a machine learning system could understand. From there it was just a matter of creating an algorithm that accurately guesses sequences of letters — just like you did for the spider and the fly.

PhD students and Pythia were both given ground-truth texts with artificially excised portions. The students got the text right about 57% of the time — which isn’t bad, as restoration of texts is a long and iterative process. Pythia got it right… well, 30% of the time.

But! The correct answer was in its top 20 answers 73% of the time. Admittedly that might not sound so impressive, but you try it and see if you can get it in 20.
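
That “correct answer in its top 20” figure is what machine learning folks call top-k accuracy, and it’s simple to pin down precisely (a minimal sketch with made-up guesses, not Pythia’s actual outputs):

```python
def top_k_accuracy(predictions: list[list[str]], truths: list[str], k: int = 20) -> float:
    """Fraction of cases where the ground truth appears among the
    model's k highest-ranked guesses."""
    hits = sum(1 for guesses, truth in zip(predictions, truths) if truth in guesses[:k])
    return hits / len(truths)

# Toy example: three lacunae, each with a ranked list of guesses.
preds = [["αθε", "αρε"], ["και", "την"], ["τον", "του"]]
truth = ["αρε", "την", "δια"]
# top_k_accuracy(preds, truth, k=1) is 0.0; with k=2 it rises to 2/3
```

The gap between top-1 and top-k is the whole point: a model that rarely nails the answer outright can still put the right reading on a short list for a human to judge.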

The truth is the system isn’t good enough to do this work on its own, but it doesn’t need to. It’s based on the efforts of humans (how else could it be trained on what’s in those gaps?) and it will augment them, not replace them.

Pythia’s suggestions may not be perfectly right on the first try very often, but it could easily help someone struggling with a tricky lacuna by giving them some options to work from. Taking a bit of the cognitive load off these folks could make the work of restoring the remaining texts both faster and more accurate.

The paper describing Pythia is available to read here, and some of the software they developed to create it is in this GitHub repository.

Swarm gets green light from FCC for its 150-satellite constellation

Swarm Technologies aims to connect smart devices around the world with a low-bandwidth but ever-present network provided by satellites — and it just got approval from the FCC to do so. Apparently the agency is no longer worried that Swarm’s sandwich-sized satellites are too small to be tracked.

The company’s SpaceBEE satellites are tiny things that will provide a connection to devices that might otherwise be a pain to get online. Think soil monitors in the middle of corn fields, or buoys in the middle of the ocean. Their signals don’t need low latency or high bandwidth — so the requirements for a satellite that serves them are much lower than for consumer broadband.

Consequently Swarm’s satellites are small — so small in fact that the FCC was worried that they would be difficult to track and might be a danger to other satellites. Part of the company’s responsibility in its application was to show that isn’t the case.

The FCC approval is just one step in the long process of getting approved to go to space for commercial operations, but it’s a big one. In addition to granting Swarm permission to send up its planned 150 satellites (and up to 600 if it decides to spread out a little), the FCC assigned Swarm the wireless spectrum it needs to operate. No use being in space if you’re forbidden from transmitting on the frequencies you need, right?

Longtime satellite communications provider ORBCOMM had objected that Swarm would be taking over some parts of the spectrum it has been assigned — but the FCC found that wasn’t actually the case, and that ORBCOMM was in fact making a sort of power play that would have extended its control over those frequencies. So its concerns were dismissed.

SpaceX also filed a comment suggesting that Swarm had not adequately considered its orbital debris footprint, neglecting in particular to include its satellites’ antennas in various calculations. It also said the satellites might be a risk to the International Space Station. But documents filed by Swarm addressing these questions seem to have satisfied the FCC completely — “We find that Swarm has taken the appropriate steps to address SpaceX’s concerns,” and it granted the application with the condition that the company abide by any upcoming orbital debris rules.

Swarm has clearly moved well past the black mark on its FCC record from when it launched test satellites without the proper approvals. The red tape involved in space operations is voluminous and it’s not uncommon to fall afoul of it — especially when your competitors, as evidenced by the above, are making more of it for you.

Now that it has its paperwork in order, Swarm plans to get its entire constellation in orbit by the end of the year.

“The FCC grant of Swarm’s spectrum and launch approvals is a big milestone for the company. Swarm is now poised to be first to market for an entire global satellite data communications constellation before the end of 2020,” said CEO and co-founder Sara Spangelo in a statement to TechCrunch.

“This is an important moment for the satellite industry, for US innovation in space, and for the large number of IoT customers world-wide that Swarm is excited to support with 2-way data services,” added CTO and co-founder Ben Longmire.

Both Sara and Ben were at TechCrunch Disrupt earlier this month, and the former sat on a panel with Bessemer Venture Partners’ Tess Hatch and OneWeb CEO Adrian Steckel (with me as moderator). We chatted about a variety of topics relating to the new space economy — if you’re thinking of getting up there yourself, you might be interested to watch it below.

Every state but Alaska has reported vape lung victims, now numbering 1,479 nationwide

A lung condition apparently caused by vaping has been reported in every state but Alaska, the CDC has announced. The total number of suspected and confirmed cases has risen to 1,479, and at least 33 people have died as a result of the affliction.

The CDC (Centers for Disease Control and Prevention) updates these numbers regularly and provides news on its progress in characterizing the condition, in which the only reliable shared factor is using vaping devices. More victims report using THC products than nicotine, but no specific chemical or mechanism has been proposed as the cause.

At the outset it appeared that the problem might have been rooted in a bad batch of unofficial vape cartridges tinged with some toxic chemical — and indeed the CDC has warned against buying vaping materials from any untrustworthy sources. But the scale of the problem has continuously grown and is now clearly nationwide, not local.

The demographics skew male (70 percent of victims) and young: 79 percent are under 35, with a median age of 23 (though among deaths the median age is 44). 78 percent reported using THC products, while only 10 percent reported using nicotine exclusively.

Reported victims are concentrated in Illinois and California, in both of which over a hundred cases have been reported, but that should not be taken as an indicator that states with fewer cases, like Kentucky and Oregon, are immune — they may simply be late to report. Likewise for U.S. territories, where only the Virgin Islands have reported cases — Puerto Rico and others are likely to be equally at risk.

If you use a vaping device and are experiencing shortness of breath or chest pain (though other symptoms are also associated), you should probably check with your doctor. In the meantime the CDC has recommended ceasing all use of vaping products, though as many have pointed out that may end up pushing some users back to cigarettes. Angry about that? Direct it at the vaping companies, which promoted their products as smoking cessation tools without adequate testing.

The CDC and FDA, along with state and municipal health authorities and partners, are working on determining the cause and any potential treatments of the “lung injury associated with e-cigarette use,” as they call it. Tests and sampling efforts are underway — efforts that probably should have been done before these products were allowed on the market.

You can keep up with the latest stats at the CDC’s dedicated page.

Microsoft accessibility grants go out to companies aiming to improve tech for the disabled

The tech world has a lot to offer those with disabilities, but it can be hard to get investors excited about the accessibility space. That’s why Microsoft’s AI for Accessibility grants are so welcome: equity-free Azure credits and cash for companies looking to adapt AI to the needs of those with disabilities. The company just announced its latest batch of grantees, including education-for-the-blind startup ObjectiveEd.

The grant program was started a while back with a $5 million, 5-year mission to pump a little money into deserving startups and projects — and get them familiar with Microsoft’s cloud infrastructure, of course.

Applications are perennially accepted, and “anybody who wants to explore the value of AI and machine learning for people with disabilities is welcome to apply,” said Microsoft’s Mary Bellard. As long as they have “great ideas and roots in the disability community.”

Among the grantees this time around is ObjectiveEd, which I wrote about earlier this year. The company is working on an iPad-based elementary school curriculum for blind and low-vision students that’s also accessible to sighted kids and easy for teachers to deploy.

Part of that, as you might guess, is braille. But there aren’t nearly as many teachers capable of teaching braille as there are students who need to learn it, and the most common technique is very hands-on: a student reads braille (on a hardware braille display) out loud and a teacher corrects them. Depending on whether a student has access to the expensive braille display and a suitable tutor at home, that can mean as little as an hour a week dedicated to these crucial lessons.

A refreshable braille display for use with apps like ObjectiveEd’s.

“We thought, wouldn’t it be cool if we could send a sentence to the braille display, have the student speak the words out loud, then have Microsoft’s Azure Services translate that to text and compare that to the braille display, then correct the student if necessary and move on. All within the context of a game, to make it fun,” said ObjectiveEd founder Marty Schultz.

And that’s just what the company’s next app does. Speech-to-text accuracy is high enough now that it can be used for a variety of educational and accessibility purposes, so all it will take for a student to get some extra time in on their braille lessons is an iPad and a braille display — admittedly more than a thousand dollars’ worth of hardware, but no one ever said being blind was cheap.
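
The loop Schultz describes (show a sentence, listen, compare) can be sketched in a few lines. Note that the transcription step is stubbed out with a plain function here; the real app would call a cloud speech-to-text service such as Azure’s:

```python
def grade_reading(target: str, transcribe) -> tuple[bool, list[str]]:
    """Compare what a student read aloud against the sentence shown on
    the braille display. `transcribe` stands in for a speech-to-text
    call; here it is any zero-argument function returning text."""
    heard = transcribe().strip().lower().split()
    expected = target.strip().lower().split()
    missed = [w for w, h in zip(expected, heard) if w != h]
    missed += expected[len(heard):]  # words the student never spoke
    return (not missed, missed)

# Stubbed transcription for illustration:
result = grade_reading("the spider caught the fly",
                       lambda: "the spider caught the flu")
# result is (False, ["fly"]): the last word needs another try
```

Wrapping that comparison in a game loop, as Schultz says, is mostly a matter of presentation: the grading itself is this simple.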

Braille literacy is dropping, and, I suggested, no surprise there: With pervasive and effective audio interfaces, audiobooks, and screen readers, there are fewer times when blind and low-vision people truly need braille. But as Schultz and Bellard both pointed out, while audio is great for media consumption, for serious engagement with the written word and for many educational purposes, braille is either necessary or a very useful alternative to speech.

Both Schultz and Bellard noted that they are not trying to replace teachers at all — “Teachers teach, we help kids practice,” Schultz said. “We’re not experts in teaching, but we can follow their advice to make these tools useful to students.”

There are ten other grantees in this round of Microsoft’s program, covering a wide variety of approaches and technologies. I like the SmartEar, for instance, which listens for things like doorbells or alarms and alerts deaf people of them via their smartphone.

And City University of London has a great idea in personalizing object recognition. It’s pretty straightforward for a computer vision system to recognize a mug or keychain on a table. But for a blind person it’s more useful if a system can identify their mug or keychain, and then perhaps say, it’s on the brown table left of the door, or what have you.

Here are the ten grantees besides ObjectiveEd (descriptions provided by Microsoft, as I wasn’t able to investigate each one, but may in the future):

  • AbiliTrek: A platform for the disability community to rate and review the accessibility of any establishment, with the ability to tailor search results to the specific needs of any individual.
  • Azur Tech Concept – SmartEar: A service that actively listens for environmental sounds (e.g. doorbell, fire alarm, phone call) and retransmits them as colored flashes on small portable boxes or a smartphone to support the deaf community.
  • Balance for Autism – Financial Accessibility: An interactive program which provides information and activities designed to better match people with programs and services.
  • City University of London – The ORBIT: Developing a data set to train AI systems for personalizing object recognition, which is becoming increasingly important for tools used by the blind community.
  • Communote – BeatCaps: A new form of transcription that uses beat tracking to generate subtitles that visualize the rhythm of music. These visualizations allow the hard of hearing to experience music.
  • Filmgsindl GmbH – EVE: A system that recognizes speech and generates automatic live subtitles for people with a hearing disability.
  • Humanistic Co-Design: A cooperative of individuals, organizations and institutions working together to increase awareness about how designers, makers, and engineers can apply their skills in collaboration with people who have disabilities.
  • iMerciv – MapinHood: A Toronto-based startup developing a navigation app for pedestrians who are blind or have low vision and want to choose the routes they take if they’re walking to work, or to any other destination.
  • inABLE and I-Stem – I-Assistant: A service that uses text-to-speech, speech recognition, and AI to give students a more interactive and conversational alternative to in-person testing in the classroom.
  • Open University – ADMINS: A chatbot that provides administrative support for people with disabilities who have difficulty filling out online academic forms.

The grants will take the form of Azure credits and/or cash for immediate needs like user studies and keeping the lights on. If you’re working on something you think might be a good match for this program, you can apply for it right here.

Feast your eyes on the first interstellar comet ever directly observed

The solar system has another interstellar visitor, but there’s no question of this one being an alien spacecraft. It’s a true comet and the first we’ve ever confirmed that comes from interstellar space, and the Hubble Space Telescope captured some amazing imagery of it. Good thing, too — because it’s never coming back.

You probably remember ‘Oumuamua as the interstellar object that launched a thousand headlines — mostly around the idea that it could be an alien ship of some kind. Needless to say that hypothesis didn’t really pan out, but honestly the object was interesting enough without being an emissary from another world.

This new comet, called 2I/Borisov (not as catchy), was first spotted in August by an amateur astronomer named Gennady Borisov, who lives in Crimea. Follow-up observations by near-Earth object authorities traced its trajectory and concluded that it did indeed come from interstellar space.

How do they know? Well, for one thing, it’s going 110,000 miles per hour, or 177,000 kph. “It’s traveling so fast it almost doesn’t care that the Sun is there,” said UCLA’s David Jewitt, who leads the Hubble team watching 2I/Borisov. (Note that in the gif above, the streaks don’t indicate its speed — those are from the Earth spinning.)

Basically the angle it’s coming in, plus the speed at which it’s traveling, mean it can’t possibly be in even a super-wide orbit of the Sun. It’s just passing through — and in early December will be less than 200 million miles from the Sun. It’s not on track to hit anything, fortunately, which would be a truly cosmic coincidence, so in a couple months it’ll be gone again.
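
That conclusion is easy to sanity-check with the energy form of the vis-viva relation: an object’s specific orbital energy is ε = v²/2 − μ/r, and a positive ε means the object is unbound. Plugging in the reported speed and assuming a heliocentric distance of about 2 AU (my rough figure, not one from the Hubble team):

```python
MU_SUN = 1.327e20            # Sun's gravitational parameter, m^3/s^2
AU = 1.496e11                # meters

v = 177_000 * 1000 / 3600    # 177,000 kph in m/s (about 49,000 m/s)
r = 2.0 * AU                 # assumed heliocentric distance

energy = v**2 / 2 - MU_SUN / r    # specific orbital energy, J/kg
escape = (2 * MU_SUN / r) ** 0.5  # local escape velocity, m/s

# energy > 0 and v > escape (~29,800 m/s): the comet is unbound
```

The comet is moving at well over the Sun’s escape velocity at that distance, so no closed orbit can hold it: it really is just passing through.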

But its short visit is ample opportunity to study its makeup, which appears to be very similar to our own “local” comets. Although it would be cool for 2I/Borisov to be super weird, its similarity is interesting in itself — it suggests that comet formation in other solar systems is not necessarily different.

2I/Borisov is passing through the ecliptic at a pretty steep angle and traveling at great speed, more or less ruling out the idea that it’s orbiting the Sun.

It is, however, very different from ‘Oumuamua, which appeared to be an inert, oblong rock. Interesting in its own way, but comets are so dynamic: clouds of dust and ice surrounding a much smaller core. Very picturesque, even if the tails don’t always point the way you think they should.

Note that these interstellar visitors are actually thought to be quite common, with perhaps thousands in the solar system at any given moment. But few are big and bright enough to be detected and studied.

Hubble will continue observing 2I/Borisov through January and perhaps beyond. If it’s never going to return, we want to gather as much data as possible while we can.

The FrankOne is a simple and portable coffee brewing gadget

The FrankOne coffee maker, fresh off a successful crowdfunding campaign, is now available for purchase, and I got a chance to test out one of the first run of these funky little gadgets. While it won’t replace my normal pourover or a larger coffee machine, it’s a clever, quick and portable way to make a cup.

Designer Eduardo Umaña pitched me the device a little more than a year ago, and I was taken by the possibility of vacuum brewing — and the fact that, amazingly, until now no one from Colombia had made a coffee maker (it’s named after Francisco de Paula Santander, who kicked off the coffee trade there). But would the thing actually work?

In a word, yes. I’ve tested the FrankOne a few times in my home, and, while I have a couple reservations, it’s a coffee making device that I can see myself actually using in a number of circumstances.

The device works quite simply. Ground coffee goes in the top, and then you pour in the hot (not boiling!) water and stir it a bit — 30-50 seconds later, depending on how you like it, you hit the button and a pump draws the liquid down through a mesh filter and into the carafe below. It’s quick and almost impossible to mess up.

The resulting coffee is good — a little bit light, I’d say, but you can adjust the body with the size of the grounds and the steeping time. I tend to find a small amount of sediment at the bottom, but less than you’d get in a cup of French press.

Because it’s battery powered (it should last for ~200 cups and is easily recharged) and totally waterproof, cleaning it is a snap, especially if you have a garbage disposal. Just dump it and rinse it, give it a quick wipe and it’s good to go. It gets a bit more fussy if you don’t have a disposal, but what doesn’t?

I can see this being a nice way to quickly and simply make coffee while camping — I usually do a French press, but sometimes drip, and both have their qualities and limitations. The FrankOne would be for making a single cup when I don’t want to have to stand by the pourover cone or deal with disassembling the French press for cleaning.

It’s also, I am told by Umaña, great for cold brew. I didn’t have the heart to tell him that I don’t really like cold brew, but I know many do, and Umaña promises the FrankOne works wonders in a very short time — four minutes rather than an hour. I haven’t tested that, because cold brew tastes like bitter chocolate milk to me, but I sincerely doubt he would mention it as many times as he did if it didn’t do what he said.

There are, I feel, three downsides. First, you’re pretty much stuck with using the included glass carafe, because the device has to create a seal around the edge with its silicone ring. It didn’t fit in my biggest mug, but you might find an alternative should the carafe (which I have no complaints about — it’s attractive and sturdy) crack or get lost.

Second, it doesn’t produce a lot of coffee. The top line as indicated in the reservoir is probably about 10-12 ounces — about the size of a “tall” at a coffee shop. Usually that’s a perfect amount for me, but it definitely means this is a single-serving device, not for making a pot to share.

And third, for the amount of coffee it produces, I feel like it uses a lot of grounds. Not a crazy amount, but maybe 1.5-2x what goes into my little Kalita dripper — which is admittedly pretty economical. But it’s just something to be aware of. Maybe I’m using too much, though.

I reviewed the Geesaa a little while back, and while it’s a cool device, it was really complex and takes up a lot of space. If I wanted to give it to a friend I’d have to make them download the app, teach them about what I’d learned worked best, share my “recipes” and so on. There was basically a whole social network attached to that thing.

This is much, much easier to use — and compact, to boot. It’s a good alternative to classic methods that doesn’t try to be more than a coffee maker. At $120 it’s a bit expensive, but hey, maybe you spend that on coffee in a month.

And by the way, you can use the discount code “TC” at checkout to get 10% off — this isn’t a paid post or anything, Umaña’s just a nice guy!

Leo Labs and its high-fidelity space radar track orbital debris better than ever — from New Zealand

Ask anyone in the space business and they’ll tell you that orbital debris is a serious problem that will only get worse, but dealing with it is as much an opportunity as it is a problem. Leo Labs is building a global network of radar arrays that can track smaller debris than we can today, and with better precision — and the first of its new installations is about to start operations in New Zealand.

There are some 12,000 known debris objects in low Earth orbit, many of which are tracked by the U.S. Air Force and partners. But they only track debris down to 10 centimeters across — meaning in reality there may be hundreds of thousands of objects up there, just as potentially destructive to a satellite but totally unknown.

“Everyone’s flying blind and no one’s really talking about it,” said Leo Labs CEO Dan Ceperley. But his company hopes to change that with a set of advanced radars dedicated to the purpose, for which it raised $13 million last year.

“We’re extremely excited to show this New Zealand radar, because it’s the first instance of our next generation technology. We launched the company on the strength of this radar,” Ceperley said.

The installation uses what’s called a phased array radar, very different from the traditional big dishes one generally thinks of. The beam is electronically steered, letting it change targets in milliseconds or sweep the sky faster than any physically controlled dish could.
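The electronic steering Ceperley describes comes down to applying a progressive phase shift across the array’s many small elements so their emissions add up in the chosen direction. Here is a minimal sketch of that phase calculation; the element count, spacing and wavelength are illustrative numbers, not Leo Labs’ actual parameters:

```python
import math

def element_phases(n_elements, spacing_m, wavelength_m, steer_deg):
    """Phase shift (radians) for each element of a uniform linear
    phased array so the combined beam points steer_deg off boresight."""
    k = 2 * math.pi / wavelength_m                      # wavenumber
    delta = k * spacing_m * math.sin(math.radians(steer_deg))
    return [(i * delta) % (2 * math.pi) for i in range(n_elements)]

# Hypothetical 16-element array, 5 cm spacing, 10 cm wavelength,
# steered 20 degrees off boresight
phases = element_phases(16, 0.05, 0.1, steer_deg=20)
```

Changing the steering angle only means recomputing these phases, which is why an electronically steered beam can hop between targets in milliseconds while a dish has to physically slew.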

The phased array radar has no moving parts; the beam is steered electronically from many identical small antennas.

Not only that, but it can detect and track objects down to 2 centimeters across. They’re small, yes, but moving at thousands of miles per hour. Something the size of an M&M still hits hard enough to take out a satellite at that speed.
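The danger of something that small is pure kinetic energy. A rough back-of-the-envelope calculation, with illustrative figures (a 2-gram object and a 10 km/s closing speed, neither taken from the article), makes the point:

```python
def kinetic_energy_j(mass_kg, speed_m_s):
    """Kinetic energy in joules: KE = 1/2 * m * v^2."""
    return 0.5 * mass_kg * speed_m_s ** 2

# ~2 g object (about an M&M) at a 10 km/s closing speed
energy = kinetic_energy_j(0.002, 10_000)   # 100,000 J
```

A hundred kilojoules delivered to a small satellite component is easily enough to destroy it, which is why tracking down to 2 centimeters matters.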

The ability to see objects of that size in orbit could increase the number tracked to a quarter of a million, Ceperley estimated. And with existing radars only able to track about a thousand objects per hour, they couldn’t possibly cover that many even if they could draw a bead on them.

“A lot of these new satellites maneuver pretty frequently — so you want to be able to track them closely,” he said. “But if you have one radar, you can measure its orbit at one point, maybe every day or two, and of course on the far side of the Earth your coverage isn’t any good. With our radar network you’ll be able to check ten times a day.”

The increasingly common phenomenon of shared-ride launches with dozens of satellites on board presents a new opportunity. Ground-based radars just aren’t designed to track 40 or 50 new objects in the sky all scooting off in different directions from the same spot. You might wait a week or more to be able to ground-truth your satellite’s telemetry. Leo’s quick-acquisition, high-precision arrays are designed with this in mind, meaning trajectories and orbits can be verified in hours instead of days. That can be the difference between saving and losing a multimillion-dollar investment.

The biggest player in this market is the U.S. Air Force, which has been the main tracking provider for years. But it relies on a hodgepodge of Cold War and newer tech, and because it’s military it’s limited in the type of information it can provide. Powerful radars are out there, but they’re often restricted by government contracts and cost hundreds of millions or more. And there are no good tracking stations in the Southern hemisphere. Leo Labs aims to pick up where the competition leaves off.

“We’re happy to announce that construction is complete on the New Zealand radar and we’re getting data out of it,” Ceperley said.

This first array will soon (after some testing, but before the end of the year) join another in Texas, and eventually others around the world, in producing data for Leo Labs’ SaaS platform — yes, it’s orbital debris tracking as a service, with a web portal and everything.

“All that intel goes into the second part of the company, a bunch of software in the cloud where the data gets analyzed,” Ceperley said. “We look for risky situations like satellites starting to tumble, potential collisions, et cetera. We send out alerts through a RESTful API, we have a dashboard with 3D visualizations, tables and maps, all that stuff. In the past there were no SaaS services for tracking satellites in flight. Governments can spend a decade and a billion dollars building a radar, but these new space companies can’t — so we thought that was a huge opportunity for us.”
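To give a flavor of what consuming such a service might look like, here is a hypothetical sketch of polling a conjunction-alert feed over a RESTful API. The endpoint, field names and auth scheme are invented for illustration; they are not Leo Labs’ actual API:

```python
import json
import urllib.request

def fetch_alerts(base_url, token):
    """Pull the current alert list from a hypothetical REST endpoint."""
    req = urllib.request.Request(
        f"{base_url}/v1/alerts?type=conjunction",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def risky(alert, max_miss_km=1.0):
    """Flag close approaches below a chosen miss-distance threshold."""
    return alert["miss_distance_km"] < max_miss_km
```

An operator’s cron job could call something like `fetch_alerts` a few times a day and page a human only when `risky` fires, which is the basic value proposition of alerts-as-a-service over raw radar data.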

You can see a visualization of what it all looks like here — obviously it’s not to scale, but space is getting crowded, isn’t it?

Already they have plenty of supporters and subscribers: Planet, Digital Globe, Black Sky and the Air Force Research Lab are all sold. Swarm Technologies, whose satellites are so small that existing radar solutions barely cut it, was a natural customer. In fact Swarm founder Sara Spangelo just recently emphasized the importance of tracking space debris in a panel I moderated at Disrupt SF.

The company was spun out of SRI in 2016, its founding team experienced in building radars and doing debris tracking, and apparently just in time. The orbital economy is heating up and the infrastructure to support it is starting to creak.

NASA’s new Moon-bound spacesuit is safer, smarter and much more comfortable

The next Americans to set foot on the Moon will do so in a brand new spacesuit that’s based on, but hugely improved from, the original Apollo suits that last went up there in the ’70s. With easier entry, better mobility and improved communications, these won’t be nearly as clumsy or restrictive — though you still wouldn’t want to wear one around the house.

The new spacesuit, known as the Exploration Extravehicular Mobility Unit or xEMU, is still deep in development, but its features have been more or less finalized. It’s already being tested underwater, and orbital testing is scheduled for 2023.

Rather than build something completely new from the ground up, NASA engineers decided to address the (sometimes literal) pain points of a previous, proven design. As such, the new suit superficially resembles the ones in which we saw moonwalkers bunny-hopping around the lunar surface. But that’s because the basic design for a suit that protects you from hard vacuum and cosmic radiation is relatively straightforward.

In NASA’s words, a spacesuit is “a personalized spaceship that mimics all of the protections from the harsh environment of space and the basic resources that Earth and its atmosphere provide.” There’s only so much wiggle room there.

But while some parts may not have changed much since the old days, others are getting major improvements. First and foremost, both for safety and mission purposes, maneuverability has been upgraded in tons of ways.

Infographic showing new and updated features of NASA’s new xEMU spacesuit

For one thing, there are altogether new joints, and better ranges of motion for existing ones. The standard “astronaut stance,” indicative of the inflexibility of the Apollo suits, should be all but eliminated by the new freedom afforded xEMU users. Not only will normal movements be easier, but astronauts will be able to reach across their own torso or lift something clear over their head.

More flexible knees and “hiking-style” boots with flexible soles will make crouching and getting up much easier as well. It’s hard to believe we got this far without those basic capabilities.

A 3D scan of the body (indicated by the dots) shows how various suits and parts would fit

The fit of the suits will be vastly better as well; NASA is using anthropometry, or 3D scanning of the body, to determine exactly which pieces and fits will be best for a given astronaut.

Speaking of which, much of the suit will be made from easily swappable, modular parts. The lower half can be switched out when doing an orbital EVA versus a surface EVA, for instance. And the helmet’s visor has a “sacrificial” protective layer that can easily be replaced with a new one if it gets damaged.

Inside the helmet, the familiar but apparently widely disliked “Snoopy caps” that housed microphones and such are gone, replaced by modern voice-activated mics and headphones that will produce much better audio quality and much less sweat.

For that matter, the entire communications stack has been replaced with a new HD camera and lights, connected by a high-speed wireless data link. Live video from the Moon may be old hat, but it’s going to be a bit different from that grainy black-and-white business in 1969.

One of the most important new features is rear entry. The awkward process of donning an old-style EVA suit requires a good deal of space and help. The new ones are entered via a hatch on the back, allowing more natural placement of arm hinges and other features, and possibly changing how the suits are mounted. One can easily imagine a suit acting as a sort of airlock: you climb in the back, it seals you in, and you walk right out into space. Well, there’d probably be more to it than that, but the rear-entry hatch could facilitate some cool stuff along those lines.

Although NASA is designing and certifying these suits, it may not actually make them itself. The agency called last week for input on how it might best source spacesuits from the commercial space industry.

That’s part of NASA’s decision to rely increasingly on contractors and private industry to support its 2024 Moon ambitions. Of course, contractors were an essential part of the Apollo program as well, but NASA is now giving them much more leeway and may even use private launch services.

You can keep up with the latest NASA spacesuit news here, of course, or at the agency’s SuitUp tag.