Combining augmented reality, 3D printing and a robotic arm to prototype in real time



Robotic Modeling Assistant (RoMA) is a joint project out of MIT and Cornell that brings together a variety of emerging technologies in an attempt to build a better prototyping machine.

Using an augmented reality headset and two controllers, the designer builds a 3D model in a CAD (computer-aided design) program. A robotic arm then goes to work constructing a skeletal model, using a simple plastic-depositing 3D printer mounted at the end of the arm.

“With RoMA, users can integrate real-world constraints into a design rapidly, allowing them to create well-proportioned tangible artifacts,” according to team leader Huaishu Peng. “Users can even directly design on and around an existing object, and extending the artifact by in-situ fabrication.”

A video uploaded by Peng shows that the system’s 3D printing is still pretty crude. Because the extruder is mounted on the end of an arm rather than over a fixed, more constrained printer bed, the resulting prints come out noticeably looser.

It is, however, a lot faster than most printers that use the familiar FDM process found in desktop machines, and as such it could eventually be useful to those looking to essentially sketch things out in three-dimensional space with a bit more control than you’d get from a 3D printing pen like the 3Doodler.

The arm is also programmed to react in real time to the designer’s actions. “At any time, the designer can touch the handle of the platform and rotate it to bring part of the model forward,” writes Peng. “The robotic arm will park away from the user automatically. If the designer steps away from the printing platform, the robotic fabricator can take the full control of the platform and finish the printing job.”
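The hand-off Peng describes boils down to a simple priority rule: the designer always trumps the robot. Here is a minimal, purely illustrative sketch of that logic in Python; it is not the project’s actual control code, and the sensor inputs (whether the designer is touching the platform or standing nearby) are assumptions made for the example.

```python
# Toy sketch of the designer/robot hand-off described above (not RoMA's real code).
from enum import Enum, auto

class ArmState(Enum):
    PRINTING = auto()   # arm is depositing material on the platform
    PARKED = auto()     # arm has retreated so the designer can work
    FINISHING = auto()  # designer stepped away; arm takes full control

def next_state(designer_touching_platform: bool, designer_present: bool) -> ArmState:
    """Designer input always wins: touching the platform parks the arm,
    leaving the workspace hands the platform back to the robot."""
    if designer_touching_platform:
        return ArmState.PARKED
    if not designer_present:
        return ArmState.FINISHING
    return ArmState.PRINTING

# Example: grabbing the handle parks the arm; walking away lets it finish the print.
assert next_state(designer_touching_platform=True, designer_present=True) is ArmState.PARKED
assert next_state(designer_touching_platform=False, designer_present=False) is ArmState.FINISHING
```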




Update for iOS and Macs neutralizes text bomb that crashed devices



Last week we reported a major bug in Apple operating systems that would cause them to crash from mere exposure to either of two specific Unicode symbols. Today Apple fixed the text-handling issue with iOS 11.2.6 and macOS 10.13.3, both now available for download.

The issue, discovered by Aloha Browser in the course of normal development, has to do with poor handling of certain non-English characters. We replicated the behavior, basically an immediate hard crash, in a variety of apps on both iOS and macOS. The vulnerability is listed on MITRE under CVE-2018-4124, if you were curious.

Apple was informed of the bug and told TechCrunch last week that a fix was forthcoming — in fact, it was already fixed in a beta. But the production version patches just dropped in the last few minutes (iOS; macOS). Apple calls the magical characters a “maliciously crafted string” that led to “heap corruption.” It seems that macOS versions before 10.13.3 aren’t affected, so if you’re running an older OS, no worries.

The iOS patch also fixes “an issue where some third-party apps could fail to connect to external accessories,” which is welcome but unrelated to the text bomb.

You should be able to download both updates right now, and you should, or you’ll probably get pranked in the near future.





The $20 Wyze security camera gets a sequel with improved intelligence and Amazon Echo support



The team behind WyzeCam is full of surprises. First they introduced a $20 plug-and-play security camera in October of last year, and now they’re already back with the sequel. The simply titled WyzeCam v2 is here, less than half a year later, bringing with it some pretty welcome updates.

The first version of the camera actually scored some pretty decent reviews for all it was able to pack into such a low-cost piece of hardware. The original did 1080p video with night vision and featured such security camera mainstays as motion and sound detection, sending alerts when it heard the beep of a smoke alarm going off.

Still priced at $20 through the company’s site, the v2 packs many of those same features and adds key updates like upgraded AI to help identify the objects it sends motion alerts for. There’s also some Alexa support on board here, according to Engadget.

It’s not the full deal with voice commands, but you will be able to view the camera’s video on the screens of the Echo Spot and Echo Show. That upgrade is set to roll out next month, along with IFTTT support. The hardware’s gotten a bit of an upgrade, as well, with an improved video sensor (though still 1080p max) and better audio capture.

The new version of the dirt cheap camera is available for pre-order now and should start shipping by the end of the month.




Algorithmic zoning could be the answer to cheaper housing and more equitable cities



Zoning codes are a century old, and the lifeblood of all major U.S. cities (except arguably Houston), determining what can be built where and what activities can take place in a neighborhood. Yet as their complexity has risen, academics are increasingly exploring whether their rule-based systems for rationalizing urban space could be replaced with dynamic systems based on blockchains, machine learning algorithms, and spatial data, potentially revolutionizing urban planning and development for the next one hundred years.

These visions of the future were inspired by my recent chats with Kent Larson and John Clippinger, a dynamic urban thinking duo who have made improving cities and urban governance their current career focus. Larson is a principal research scientist at the MIT Media Lab, where he directs the City Science Group, and Clippinger is a visiting researcher at the Human Dynamics Lab (also part of the Media Lab), as well as the founder of non-profit ID3.

One of the toughest challenges facing major U.S. cities is the price of housing, which has skyrocketed over the past few decades, placing incredible strain on the budgets of young and old, singles and families alike. The average one-bedroom apartment rents for $3,400 in San Francisco and $3,350 in New York City, making these meccas of innovation increasingly out of reach for even well-funded startup founders, let alone artists or educators.

Housing alone is not enough to satiate the modern knowledge-economy worker, though. There is an expectation that any neighborhood will have a laundry list of amenities, from nice, cheap restaurants, open spaces, and cultural institutions to critical human services like grocery stores, dry cleaners, and hair salons.

Today, a zoning board can do little more than demand that various developments include the necessary amenities as part of the permitting process, an approach that still leaves us with food deserts and the curious soullessness of some urban neighborhoods. In Larson and Clippinger’s world, though, rules-based models would be thrown out for “dynamic, self-regulating systems” based around what might agnostically be called tokens.

Every neighborhood is made up of different types of people with different life goals. Larson explained that “We can model these different scenarios of who we want working here, and what kind of amenities we want, then that can be delineated mathematically as algorithms, and the incentives can be dynamic based on real-time data feeds.”

The idea is to first take datasets like mobility times, unit economics, amenity scores, and health outcomes, among many others, and feed them into a machine learning model that tries to maximize local residents’ happiness. Tokens would then be a currency that signals to the market which things should be added to the community, or removed, to improve that happiness.

A luxury apartment developer might have to pay tokens, particularly if the building didn’t offer any critical amenities, while another developer who converts their property to open space might be completely subsidized by tokens that had been previously paid into the system. “You don’t have to collapse the signals into a single price mechanism,” Clippinger said. Instead, with “feedback loops, you know that there are dynamic ranges you are trying to keep.”
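To make the feedback-loop idea concrete, here is a deliberately simplified sketch. The metrics, target bands, and numbers are invented for illustration and are not drawn from Larson and Clippinger’s actual model; the point is only that fees and subsidies can be computed from how a project moves neighborhood indicators relative to desired ranges, rather than from a fixed rulebook.

```python
# Toy illustration of a "dynamic range" token mechanism (not the researchers' model).
TARGET_RANGES = {               # assumed example metrics, scored 0-100
    "open_space":    (40, 70),
    "food_access":   (50, 80),
    "affordability": (45, 75),
}

def token_adjustment(current: dict, impact: dict, rate: float = 10.0) -> float:
    """Positive result = subsidy paid to the developer; negative = fee owed.
    `impact` is the projected change a development causes to each metric."""
    total = 0.0
    for metric, (low, high) in TARGET_RANGES.items():
        before = current[metric]
        after = before + impact.get(metric, 0.0)
        def gap(x):  # distance outside the target band
            return max(low - x, 0.0, x - high)
        # Reward moves toward the band, charge moves away from it.
        total += (gap(before) - gap(after)) * rate
    return total

neighborhood = {"open_space": 35, "food_access": 60, "affordability": 50}
luxury_tower = {"open_space": -5, "affordability": -10}  # pushes metrics further out of band
pocket_park  = {"open_space": +8}                        # pulls a metric back into band

print(token_adjustment(neighborhood, luxury_tower))  # negative: the developer pays tokens
print(token_adjustment(neighborhood, pocket_park))   # positive: the project is subsidized
```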

Compare that systems-based approach to the complexity we have today. As architectural and urban planning tastes have changed and developers have discovered loopholes, city councils have updated the codes, and then updated the updates. New York City’s official zoning book is now 4,257 pages long (warning: 83MB PDF file), all of it meant to rationalize what a beautiful, functional city should look like. That complexity has bred a massive influence and lobbying industry, as well as startups like Envelope that try to make sense of it all.

A systems-based approach would throw out the rules while still seeking positive end results. Larson and Clippinger want to go one step further though and integrate tokens into everything in a local neighborhood economy, including the acquisition of an apartment itself. In such a model, “you have a participation right,” Clippinger said. So for instance, a local public school teacher or a popular baker might have access to an apartment unit in a neighborhood without paying the same amount as a banker who doesn’t engage as much with neighbors.

“Wouldn’t it be great to create an alternative where instead of optimizing for financial benefits, we could optimize for social benefits, and cultural benefits, and environmental benefits,” Larson said. Pro-social behavior could be rewarded through the token system, ensuring that the people who made a neighborhood vibrant could remain part of it, while also offering newcomers a chance to get involved. Those tokens could also potentially be fungible across cities, so a participation right token to New York City might also give you access to neighborhoods in Europe or Asia.

Implementation of these sorts of systems is certainly not going to be easy. A few years ago on TechCrunch, Kim-Mai Cutler wrote a deeply-researched analysis of the complexity of these issues, including the permitting process, environmental reviews, community support and opposition, as well as the basic economics of construction that make housing and development one of the most intractable policy problems for municipal leaders.

That said, at least some cities have been eager to trial parts of these algorithm-based models for urban planning, including Barcelona and several Korean cities, according to the two researchers. At the heart of all of these experiments, though, is a belief that the old models are no longer sufficient for the needs of today’s citizens. “This is a radically different vision … it’s post-smart cities,” Clippinger said.

Featured Image: Nicky Loh/Bloomberg/Getty Images




Hulu’s weekend outage affected some users trying to watch Olympics, NBA All-Star game



Hulu is still recovering from a hit to its reputation after a live streaming outage affected some subscribers during the Super Bowl. But over the weekend, the company’s live TV service experienced another outage, this time leading to login and connection issues. The problems prevented users from watching high-profile sporting events, including the Olympics and the NBA All-Star game, as well as other live TV programming and video on demand.

The company’s support Twitter account officially confirmed the problem on Saturday, February 17, 2018, saying the team was investigating the issue.

Shortly after confirming the problem was resolved, the account posted again to say that it was working on an issue affecting login.

Around an hour later, Hulu announced all affected services were fully restored.

But on Sunday night, February 18, 2018, Hulu tweeted again that some users were still experiencing login troubles.

This last issue wasn’t confirmed to be resolved until midnight on Monday, February 19, 2018.

That means that for the majority of the weekend, some portion of Hulu users were not able to use the service as expected. Needless to say, the affected subscribers were fairly upset about this, as indicated by their angry tweets.

Following every tweet from the @Hulu_Support account is a stream of tweets from users who couldn’t log in, couldn’t stream, were experiencing lag, couldn’t start shows and, in some cases, had spent a lot of time troubleshooting the issues themselves to no avail. Many were also confused and upset because, as it turned out, Hulu had tweeted that the issues were resolved when they were not.

Some customers are even now demanding refunds for the month, or a free month of service like the one those affected by the Super Bowl outage received. (Hulu doesn’t have an official plan to dole out refunds this time, as it did for the Super Bowl, but its customer support team may give out refunds on a case-by-case basis for those who ask, we’ve heard.)

Hulu has declined to comment on the outage.

According to sources familiar with the situation that occurred over the weekend, the outages were a result of an issue at Hulu’s Las Vegas data center. The company had to migrate its services to another data center, and then later, migrate them back, leading to downtime for users. The root cause, however, is still under investigation.

This is a different problem from what had happened during the Super Bowl. The big game was interrupted for some due to a problem with Hulu’s system for extending live programming past its scheduled stop time.

That distinction may not matter much to Hulu users, however.

Unfortunately for Hulu, there’s little tolerance for technical issues with live TV services these days, given how many alternatives are now available. If one is not working well, subscribers can simply cancel and jump to another, like AT&T DirecTV Now, Dish’s Sling TV, Sony’s PlayStation Vue, fuboTV, Philo, or Google’s YouTube TV.

Even if the reported streaming issues don’t impact all users – Hulu said that only a “small percentage” were affected during the Super Bowl, for example (and this weekend’s outage may be even smaller) – consumers may still be concerned about the service’s stability because of what’s happening to other customers.

Meanwhile, YouTube TV has been capitalizing on Hulu’s connectivity issues to tout its own service’s reliability. When announcing its channel expansions and $5 per month price increase last week, for example, YouTube TV highlighted how stable its service is.

“We have, by a very wide margin, the most live local stations. And on top of that, it’s also about quality of signal and reliability of signal,” Heather Moosnick, Director of Content Partnerships at YouTube TV, had said in an interview with TechCrunch.

“Those are two things we’re really focused on. We want people to know that when they’re signing up to a live TV service, they’re going to get a live TV service….It’s something we’re very invested in and committed to. If it’s the first time you’ve signed up to a live TV service and you can’t watch the Super Bowl – we want to make sure that doesn’t happen to you,” she added, clearly referencing Hulu’s recent struggles.

Hulu had heavily promoted its custom interface for tracking the Olympics as a big differentiator from other streaming competitors. The company’s app allows users to follow their favorite events, and keep up with them from a personalized dashboard. But however great the feature is, it’s no good if the service itself doesn’t work.

Hulu is not the first to struggle with connectivity in the early days of its service. Both Sling TV and DirecTV Now experienced issues in the past, and both have since grown to become the two largest such services in terms of subscribers.





Nike teamed up with Snap and Darkstore to pre-release Air Jordan III ‘Tinker’ shoes on Snapchat



Snap, Nike, Darkstore and Shopify teamed up in a collaboration of epic proportions to pre-release the Air Jordan III “Tinker” on Snapchat with same-day delivery last night after the NBA All-Star game. This is the first time a brand other than Snap has sold a product via Snapchat.

The thousands who attended the Jumpman All-Star after-party in Los Angeles last night were able to scan exclusive Snap codes to receive the shoes by 10:30pm that same night. Once they scanned the Snap code, they were brought into the Snapchat app, where they could then purchase the sneakers.

Within 23 minutes, all the shoes sold out, Darkstore CEO Lee Hnetinka told me. Darkstore, a startup that aims to become an “invisible retailer,” facilitated the deliveries.

“This is the holy grail of the experience [Nike is] trying to intend, which is direct to consumer — to the actual consumer, versus a bot — and same-day delivery,” Hnetinka said. “The Snap code introduces a new paradigm for commerce.”

Darkstore works by exploiting excess capacity in storage facilities, malls and bodegas, and enables them to be fulfillment centers with just a smartphone. The idea is that brands without local inventory can store products in a Darkstore and then ship them out the same day.

In addition to the exclusive Snap codes, Snapchat geofenced the area over the Staples Center in downtown Los Angeles during the All-Star game. Within that geofence, fans had access to a special 3D augmented reality Michael Jordan lens.

The official release of the shoe isn’t until March 24, but Nike wanted to do something extra special in celebration of the 30th anniversary of Michael Jordan’s slam dunk in 1988. That dunk is often referred to as the moment when Jordan “took flight.”

This isn’t Nike’s first time selling shoes via app-based experiences. Last June, Nike’s release of the SB Dunk High “Momofuku” required people to go to a Momofuku restaurant, or to the Momofuku website, and then point their camera at the menu in order to see a sneaker pop up in augmented reality. From there, sneakerheads could purchase the shoes. Similar to what Nike is doing with Snapchat, you have to physically, or virtually, be somewhere in order to buy a pair.


This collaboration also marks Snap’s most aggressive move into the in-app e-commerce game. Snap launched the Snap Store within the Snapchat app’s Discover section earlier this month to sell the Dancing Hot Dog Plushie, the Snapchat winkface sweatshirt and other Snap-related products. At the time, TechCrunch’s Josh Constine noted Snapchat could position itself as a way for top brands to reach their audiences in a medium that bridges both shopping and social experiences.




Pebble founder Eric Migicovsky has joined Y Combinator as a partner



If you follow the startup industry, you likely know the story of smartwatch maker Pebble, including that famous Kickstarter campaign in 2012 that sought $100,000 but wound up raising more than $10 million instead. You might also remember thinking that Pebble’s fate was sealed once Apple launched its own now-ubiquitous smartwatch in 2014. You were right if so. By late 2016, Pebble was forced to sell its software and intellectual property to another wearable giant, Fitbit, for less than $40 million — an amount that reportedly barely covered Pebble’s debt.

What you probably don’t know was that behind the scenes, Pebble founder Eric Migicovsky was frequently seeking advice from Y Combinator. Pebble had passed through the accelerator’s program in the winter of 2011 and like many alums, Migicovsky had formed strong bonds with both his fellow founders and with YC execs, including its president, Sam Altman. “Seven years later . . . I was still phoning Sam at 11 p.m. to get help in that deal” to Fitbit, says Migicovsky with a laugh.

Now, Migicovsky will be sharing lessons learned with future YC startups, having quietly joined YC last month as one of its now 18 full-time partners. His role: to work with incoming teams, including those whose companies have a hardware component.

We talked with Migicovsky late last week about his wild ride to date and what he hopes to accomplish in his new role. Our chat has been edited for length.

TC: Your relationship with YC dates back some time.

EM: I was in Waterloo, Ontario [studying engineering and starting up Pebble] and [YC founders] Paul [Graham] and Jessica [Livingston] wound up investing and I ended up moving the company to California. YC was really the first [outfit] to believe in us. We did the winter 2011 batch and did our Kickstarter a year later — before it was even a thing — because we couldn’t raise money. It was hard days for hardware back then.

YC played an amazing role through the sale to Fitbit, and after I sold the company, I took some time off, but because I’d been part of YC, I began working last summer as a part-time partner. It was a great chance to start mentoring companies and to spend one-on-one time with the founders, and [YC CEO] Michael [Seibel] and Sam said I should jump on board.

TC: You’ve now joined full-time. What is your role exactly?

EM: I’m definitely covering the hardware desk. About 15 percent of companies going through YC have some connection to hardware, be it enterprise hardware, software with an enterprise component . . . So I’ll be a main point of contact for many of those companies.

TC: Have you done any investing in the past?

EM: I’ve done some but I didn’t have much time outside of Pebble, so the opportunity to take some of the anecdotes and experiences I’ve gathered and help apply them at other companies is really exciting to me.

TC: What are some of the lessons learned that you’re likely to share with these startups?

EM: In the early days of a startup, it’s pedal to the metal; you’re doing what it takes to get going. But after it starts taking off, it’s important to keep in mind a vision of where the company is going in the long run. At Pebble, we had a great first shot with Kickstarter. We were able to capture people’s attention and imagination. We had this “activation” energy. But it didn’t carry us into the next stage. I didn’t have that longer term vision in mind necessarily when times got hard. I wasn’t thinking about the company’s world-changing mission, and that’s something I talk about with startups.

TC: So you think you could have prevented Pebble’s eventual outcome, despite the technical and marketing muscle of a competitor like Apple. Where do you think Pebble went wrong more specifically?

EM: We sold a quarter billion dollars worth of watches. We were in a good position. What we lost was that seed. We were a hacker product from the get-go. We were building a watch that anyone could program for, but under pressure from competitors that came in, we didn’t find the one thing to stake our claim on. We vacillated between fitness and productivity in trying to find our groove and scale to a larger audience, and we couldn’t find it in time. Meanwhile, others like Fitbit had done this really well. Fitbit was like, “We’re a health and wellness product.”

TC: YC has changed a lot since 2011. Do you find the scale at all daunting?

EM: It has changed a ton. When I went through in 2011, there were 40 companies in the batch and people wondered how YC could be [managing so many startups]. Now, it’s more than 150 companies in each batch. What’s amazing to me are the processes and software that Michael and Sam have built out to help the companies and to help the partners. For example, we have evaluation process software that helps us to manage office hours and mentoring at scale. It’s pretty awesome.

TC: Is it too soon to say whether it’s more awesome than running your own company? Would you want to launch another startup?

EM: I just started full time four weeks ago, but one of the things I’m loving, working at YC, is [interfacing] with hundreds of companies. I ran Pebble for nine years, and running a company is a very laser-focused operation. You need to be thinking every single minute about how you can help your employees and product. YC is a very different experience. I can dive deep into a particular company’s problem, provide a relevant story or connection, then jump back out and move on to the next company. It’s kind of like ADD for startups.

YC is announcing 11 other appointments today, including those of two visiting partners. You can learn more here.

Featured Image: Ramin Talaie/Corbis/Getty Images




Molly wants to use your online presence to create an automated knowledge base



Isn’t it frustrating when you ask a friend a question – like what’s your favorite restaurant in New York or what trips have you been on this year – knowing that these specific answers are certainly already accessible on social media?

The problem is no one wants to spend an hour combing through their friends’ social media pages (or worse, monitoring them 24/7), so we just end up asking them directly.

Molly wants to fix this. The startup wants to make information more easily accessible, primarily by cross-pollinating information posted on your various online profiles and making it available in one central location. The startup was founded by Chris Messina, Esther Crawford and Ethan Sutin over the summer, and is now part of Y Combinator’s Winter ’18 batch.

Eventually this information can be made available via an Alexa skill or a chatbot, so you could theoretically ask “I want to have dinner with Kylie tonight, pick a restaurant we both haven’t been to, but one our individual preferences suggest we’ll both like”. If an automated database can remove the legwork of answering these basic questions (that already have an answer if you just know where to look), more time can be spent on actually interacting and spending time with one another.
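As a back-of-the-envelope illustration of that kind of query (with entirely made-up data, names, and scoring, not Molly’s actual API or model), the “restaurant neither of us has tried but both would like” question reduces to an intersection plus a joint-preference ranking:

```python
# Hypothetical sketch of the query Molly's founders describe; data and logic are invented.
me = {
    "visited": {"Carbone", "Via Carota", "Shake Shack"},
    "likes":   {"italian": 0.9, "ramen": 0.6, "burgers": 0.3},
}
kylie = {
    "visited": {"Via Carota", "Ippudo"},
    "likes":   {"italian": 0.7, "ramen": 0.9, "burgers": 0.2},
}

restaurants = {           # candidate restaurants tagged by cuisine
    "Lilia": "italian",
    "Ivan Ramen": "ramen",
    "Via Carota": "italian",
}

def pick_dinner(a, b, candidates):
    """Return the best restaurant neither person has visited, ranked by the
    weaker of the two people's inferred preference for its cuisine."""
    options = {
        name: min(a["likes"].get(cuisine, 0), b["likes"].get(cuisine, 0))
        for name, cuisine in candidates.items()
        if name not in a["visited"] and name not in b["visited"]
    }
    return max(options, key=options.get) if options else None

print(pick_dinner(me, kylie, restaurants))  # -> "Lilia": both like Italian, neither has been
```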

Of course, due to natural language processing constraints and machine learning model training time, this full vision is at least a couple of years out, according to Messina. But they have to start somewhere, so the first iteration of Molly is an AMA (ask me anything) feature where audiences can find answers about a certain person, with those answers aggregated from a wide variety of sources like Medium, Twitter and Instagram.

Right now they’re only launching with featured profiles who have done a Product Hunt AMA in the past (though anyone can ask questions), since that gives them an easy database of questions and answers to scrape and pre-populate a lot of information. Molly will also occasionally send users quizzes to take, with the answers recorded in their knowledge bank for future audiences to access.

And lastly, there’s a feature for questions Molly can’t answer to be automatically forwarded along to users, so high-profile people can have a centralized place to take all questions from fans (and not have to answer duplicates, since Molly will automatically find the answer if it’s already been asked).

Eventually the plan is to open up profiles so anyone can have a database of answers and preferences for friends to access. And these don’t all have to be totally public – Messina envisions a potential one-time permission-based system where you could grant a friend access to Molly just for specific purposes and a set period of time, like finding a restaurant for tonight.

Right now Molly’s founders say it’s too early to think about monetization, and they’re focused on finding product-market fit. But Messina hinted that (way) future versions could use the knowledge base they’ve built up to recommend restaurants or bars it knows you’ll like – which could be a future source of revenue.

The startup has raised $1.5 million from BBG, Betaworks, CrunchFund and Halogen Ventures.




Join me for an evening of crypto with writers Paul Vigna and Michael Casey



I’m pleased to announce that next week I’m going to have Paul Vigna and Michael Casey, authors of The Truth Machine, on stage with me at Knotel, a co-working and event space in Manhattan. I’d love for you to come.

You can RSVP here. It’s happening on February 28 at 7pm and it will feature a 35-minute talk with two of the top writers in crypto. These guys literally wrote the book on Bitcoin and their new book is about to hit store shelves. They were kind enough to give us a few minutes of their time to talk about the book, ICOs, and the future of crypto. It’s going to be great.

Please sign up now as space is limited. I’ll see you next week!


