New York beat out 24 other global cities as the top city for attracting and fostering women-owned businesses, according to a new report commissioned by Dell. The study, conducted in the spring of 2016, evaluated 50 cities around the world on access to capital, technology, workforce talent, culture and markets.
Australian physicists, perhaps searching for a way to shorten the work week, have created an AI that can run and even improve a complex physics experiment with little oversight. The research could eventually allow human scientists to focus on high-level problems and research design, leaving the nuts and bolts to a robotic lab assistant.
Kayon Partners LLC has teamed with The LeoGroup LLC as sub-advisor to the private equity fund LeoGroup Private Investment Access, LLC. LeoGroup is expanding its offerings to ultra-affluent individuals through its multi-family office platform.
Bill Campbell, a management guru to Steve Jobs and other Silicon Valley luminaries, has died after a long battle with cancer. He was 75. Although not widely known outside Silicon Valley, Campbell played a pivotal role in shaping the direction of both Apple and Google, two of the world’s most powerful companies.
When I met with Buzz Aldrin to discuss his new book No Dream Is Too High: Life Lessons From A Man Who Walked On The Moon, he described himself as possibly “among the luckiest guys.” After all, his mother was born in 1903, the same year that the Wright brothers made their first flight, and Aldrin himself was born less than three decades later. Yet in the span of his own life…
John Carmack is a legend in video game programming, and on Friday he received the highest honor of the British Academy of Film and Television Arts (BAFTA). He gave a short acceptance speech and did an interview that should inspire the whole game industry to do greater things. We’ve posted the video below, and pulled a few choice quotes from it.
Carmack has been involved in a lot of pioneering work, from establishing the first-person shooter genre with Wolfenstein 3D and Doom to helping create the Samsung Gear VR virtual reality headset that debuted last fall. Carmack is currently chief technology officer at Oculus VR, which Facebook acquired for $2 billion in 2014, and he helped create the Oculus Rift headset that debuted for the PC on March 28.
Upon accepting the BAFTA Fellowship award onstage, Carmack said, “A strong team can take any crazy vision and turn it into reality.”
“Thank you very much for this honor, but I’m just getting started,” Carmack said.
In the interview afterward, he said he felt uncomfortable about getting a lifetime achievement award because he was “only 45 years old, and I’ve got a whole lot more programming left in me.”
“Obviously, virtual reality is where I’ve placed my bet about the future and where the excitement is going. At this point, I could say it’s almost a lock. It’s going to be magical, it is magical, and great things are coming from that. Along the way, I was focused on the first-person shooters. I said we should go do something on mobile. We were doing mobile games before the iPhone. We were doing free-to-play with Quake Live. We wanted to do massively multiplayer stuff in the early days but didn’t have the resources to do it.”
Carmack said it took people with the full weight of resources to flesh out those trends. He said the new generation of indie Steam games shows that you can make the magic of gameplay without the huge bet that triple-A (blockbuster) game teams are putting into development. He said it will take a little while before VR “hits on all of the things that become the magic of the medium.”
Mesosphere, a startup that sells a “data center operating system” drawing on open-source tools such as Apache Mesos, today announced a $73.5 million funding round. Hewlett Packard Enterprise led the round, with participation from Microsoft, according to a statement.
In addition to the new funding, Mesosphere is announcing the 1.0 release of the Marathon open-source software for managing containers on top of Mesos. Mesosphere is also announcing the launch of software called Velocity for continuous integration (CI) and continuous delivery (CD). This type of technology helps developers deploy new software builds on servers. The idea is to help companies quickly and easily make software tweaks to improve services and fix bugs. Velocity is based on Marathon and integrates with the open-source Jenkins CI tool.
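To make the Marathon side of this concrete, Marathon manages containers through JSON "app definitions" submitted to its REST API. Below is a minimal sketch in Python; the app id, image, and host are illustrative assumptions, not details from the announcement:

```python
import json

# Hypothetical Marathon app definition: ask the cluster to keep two
# copies of an nginx container running. The id and image are
# illustrative only.
app_definition = {
    "id": "/web-frontend",           # hypothetical app id
    "container": {
        "type": "DOCKER",
        "docker": {"image": "nginx:1.9"},
    },
    "cpus": 0.5,     # CPU share per instance
    "mem": 128,      # memory in MB per instance
    "instances": 2,  # Marathon keeps this many copies running
}

payload = json.dumps(app_definition)
print(payload)

# A CI/CD tool in the vein of Velocity or Jenkins would submit a
# payload like this to Marathon's apps endpoint after each successful
# build, e.g.:
#   curl -X POST http://<marathon-host>:8080/v2/apps \
#        -H "Content-Type: application/json" -d @app.json
```

The point of pairing this with a CI tool is that the same definition can be re-posted with a new image tag on every green build, which is the "push code live quickly" workflow the statement describes.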
“A large number of major companies, including eBay, already run massive CI/CD environments built on Mesos, Jenkins, Docker and other tools in order to speed up the process of pushing code live and making developers more agile,” Mesosphere said in the statement. “Velocity brings these same capabilities to every type of company.”
These new tools could help Mesosphere further distinguish itself from other companies — such as Docker — that deploy applications in containers, which are lightweight alternatives to more traditional virtual machines.
Then again, the move could put Mesosphere in more direct competition with other container-oriented CI tool providers.
A Capital, Andreessen Horowitz, Fuel Capital, Khosla Ventures, and Triangle Peak Partners joined in the new funding round, alongside Hewlett Packard Enterprise and Microsoft.
Marathon is part of Microsoft’s recently launched Azure Container Service for deploying container-based applications on the Azure infrastructure, according to the statement.
Mesosphere started in 2013 and is based in San Francisco. The startup has more than three dozen customers, including Samsung, Verizon, and Yelp, a spokesperson told VentureBeat.
BMW was among the first auto manufacturers to introduce integration with iPhones in 2011. Now, five years later, the company has at last announced at the New York International Auto Show that BMW Apps will integrate with Android devices. The first three apps to work with the iDrive system in the 2016 BMW 7 Series are all about the music: iHeartRadio, Pandora and Spotify. So far, it’s…
The developer preview of Android N, the upcoming version of Google’s Android mobile operating system, comes with a few new features that build on its predecessor, Android Marshmallow, right out of the box (and more new stuff will arrive in previews to follow). One highlight of this initial preview is a split-screen mode that works in both portrait and landscape orientations on smartphones and tablets.
Split-screen is particularly impressive on Google’s recently released Pixel C convertible tablet, which is one of just two tablets that can run this first developer preview. The other is the Nexus 9. Sure, split-screen does work on Android phones — I’ve been experimenting with it on a Nexus 5X — but on tablets it looks more impressive, and you can be more productive.
The change is important because it brings Android to feature parity with rival platforms. Microsoft’s Surface, and any other convertible tablet that runs Windows, lets you split your screen like a champ with little tinkering. Samsung has included Multi Window mode on its phones and tablets for years. And Apple just recently rolled out its take on multi-window functionality, but for just a single iOS device.
When you compare it with the Split View feature that came with the iPad Pro, the split-screen mode in Android N is more flexible.
You can adjust the slider between the two windows being shown on the display. Well, to a certain point — you can’t make one window take up less than about a third of the display. With the iPad Pro, Split View in landscape mode only lets you give windows an approximately 75/25 split or a 50/50 split. Split View in portrait on iPad Pro doesn’t even let you adjust the proportions; you’re stuck at 60/40. With that said, Android N is less flexible on a phone than on a tablet — that is, you can’t move the slider in landscape mode.
And unfortunately, at the moment not all apps work with Android N’s split-screen feature. For example, Uber, Lyft, Snapchat, Instagram, and Minecraft: Pocket Edition won’t run in anything less than the whole screen. The same problem hampers Split View on the iPad Pro.
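That app-by-app support is because split-screen is an opt-in: on Android N, a developer declares in the manifest whether an activity may be resized into a split-screen pane. A minimal sketch of the relevant attribute (the activity name here is a hypothetical example; the attribute defaults to true for apps targeting Android N):

```xml
<!-- AndroidManifest.xml (fragment) -->
<activity
    android:name=".MainActivity"
    android:resizeableActivity="true" />
```

Apps that set this to false, or that haven't been updated for N, behave like the holdouts above and insist on the whole screen.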
The feature does have bugs. Some views in certain apps simply don’t load correctly. But this is only one week into a developer preview. Google and third-party app developers should smooth out these kinks in the next few months, leading up to the official release of Android N.
But even with these shortcomings, the arrival of split-screen on Android N is a revelation about what you can get done on your Android device. Keep an eye on Twitter while you’re browsing in Chrome? Of course. Talk with your teammates on Slack while you’re dealing with email in Gmail? Sure. Chat with someone on Messenger while drawing in Autodesk SketchBook? You got it. Add a Trello card while you’re watching a YouTube video? Yup. On and on and on. App switching is simply less time-consuming and less error-prone when you can have two apps open at once.
On the Pixel C, this feature works especially well, because you can type fast on its sturdy keyboard and have the screen propped up at a comfortable angle. Tablet computing starts to feel like less of a novelty and more like desktop computing. And for an Android device, that’s exciting.
It is difficult to explain virtual reality to someone who has never tried it before, but one of the most exciting developers working in the space might have solved the problem.
Studio Owlchemy Labs, which is making the VR game Job Simulator, has a new trailer in the works that mixes together video of a real-world player and the simulated office he’s interacting with. This creates a seamless image that enables us outsiders to get a third-person view of what the player is doing inside of the virtual world. Previously, audiences have had just a first-person view of the action, which doesn’t really convey a sense of presence.
Companies like Sony have used computer-generated graphics to suggest what it’s like to play a game in VR. But those solutions always fail to accurately portray the sensation of existing in a simulation, and that’s a problem for a technology that is asking consumers to spend upward of $1,500. If this market is going to grow into a $40 billion business by 2020, it’s going to need to figure out how to demonstrate VR in an exciting and instantly recognizable way.
Owlchemy is onto something with the mixed-reality tests it’s doing with Job Simulator. In the short, 7-second clip below, you can see a man picking up things on a desk and throwing them, which is a big part of actually playing the game with the Vive.
Check it out:
This isn’t an animation or a mock-up. The video shows a person who is really playing Job Simulator and how his actions are affecting that world. He picks up the coffee cup and throws it in the air, catching it with his other hand. He throws a paper airplane across the room.
I’ve played Job Simulator, and it feels exactly like this. You get the sense that you’re really standing at a desk or in a kitchen despite the cartoon graphics, and you are just knocking things around with the controllers and throwing items around.
This isn’t to say that this is a perfect solution. I’m sure some people will see this and not understand why a real man is standing inside of an old video game office like a cheap 1990s version of Who Framed Roger Rabbit?, but this is the best 2D translation of the 3D VR experience I’ve seen yet. And it’s likely viewers will see a lot more of this as developers begin trying to market their Vive, Rift, and PSVR games.