Drones and Robots Come Out to Play at Sparkfun’s 6th Annual Autonomous Vehicle Competition

Sparkfun AVC 2014

Sparkfun Electronics held their 6th annual Autonomous Vehicle Competition last weekend, and this year was bigger than ever. The action was at Boulder Reservoir in Colorado, but anyone could follow along (with a few technical difficulties) on the YouTube live stream (Part 1 and Part 2).

The story of the day was Team SHARC’s Troubled Child, which won the ground vehicle doping class. Rather than mess around with miniature cars, Team SHARC built their ‘bot out of a freaking Jeep, a 1986 Jeep Grand Wagoneer to be exact. Troubled Child had no problem getting around the course. One could say it carried the entire team. Literally – the rest of Team SHARC’s robots are riding along on top of Troubled Child in the picture up there.

There was also plenty of action in the aerial competition. Sir Crash-a-Lot was the first drone to find a watery doom at Boulder Reservoir. The last we saw of it on the stream, the team was looking for some divers.

Aircraft cannot be hand-launched at the AVC. That's not a problem for rotary-winged vehicles, but the rule has led to some interesting solutions for fixed-wing aircraft. The disguised "Team Falcon" showed up with an incredible compressed-air launcher, which used a gallon water jug to fire their delta-winged plane on its way to a clean run. Team Karma550 wasn't quite as lucky: their helicopter crashed hard and threw up quite a bit of smoke.

We’re still waiting for more detailed results, but if you want the full scores, they are available on Sparkfun’s AVC scoreboard page.


Filed under: drone hacks, robots hacks

Programmable Logic I – PLA/PAL


Yeah, I am still a little pissed that the competition is still around and we aren't, and by "we" I mean Commodore Business Machines (CBM). It was Commodore that had the most popular home computer ever in the C64 (27 million sold), and it was a team of MOS engineers, after all, who had the idea to make a "micro" processor out of a 12-square-inch PCB.

MOS Technologies logo and address

MOS Technology in King of Prussia/Norristown

Of course, they worked at Motorola at the time, and "Mot" did not want anything to do with a reduction of the profit margin on the pie-plate-sized processor. Of course MOS got sued by Motorola, but that was an average Tuesday at MOS/CBM. I absolutely credit CBM with buying the MOS Technologies chip foundry, as together we could make our own processors, graphics chips, sound chips, memory controllers, and programmable logic.

With this arsenal at our beck and call we didn't have to make compromises the way other companies did, such as conforming to the bus spec of an industry-standard 6845 or having to add extra logic when a custom extra pin would work. We could also make sprites.

6502 Design Team (EE Times 1975, archive.archaeology.org)

The compromise we did have to make when designing was cost, and I mean the kind of cost reduction where finding a way to save a dollar ($1 USD) saved millions over the production run. I knocked $0.90 USD out of a transformer one day and couldn't focus for the rest of the day due to elation.

Cost reduction is a harsh mistress, however, as you can't just do it a little, some of the time, or only when you want to. The mental exercise of multiplying anything by a million was always there, and it made it hard to buy lunch; I'd be blocking the lunch line while figuring the cost of a million tuna sandwiches FOB Tokyo.

To offset stringent cost control we had massive quantity discounts. Calling a local rep to say that we were designing in the latest 16k×4 DRAM was known as "the call", and inevitably the doorbell would ring two minutes after we hung up the phone.

In a rare event we lost our management when [Mr. Jack Tramiel] quit and later ended up at Atari (picture the Borg knowing everything Capt. Picard does). During the time of "no management" a couple of things happened: the number of cases of beer consumed on premises skyrocketed, and the engineering department designed a computer to fill the void without direction from management or marketing, at least for a while.

This meant that the design ultimately came from a piece of grid paper instead of a few dozen meetings discussing customer focus and the cost of a million tuna sandwiches. The shopping list included a memory manager and the biggest PLA we had done to date. It should be noted that no one in chip design says "no" or "we can't do it" at this stage; that's because we just don't know yet. But if the engineer making the requests doesn't want to be done in later by the laws of physics, he or she had better keep the requests within the realm of "almost doable".

In the Commodore 64 there was a PLA modeled after the Signetics 82S100, which used a "blown fuse" technology for programming a simple AND-OR array. When I say blown fuse I am not kidding: my memory is that one of the parameters was not to blow the fuse using excessive power, as it might "splatter".

PLA Die (C128.com)

Ultimately Commodore and MOS Technologies made their own NMOS version. I won't say we copied the Signetics part; we just took lots of Polaroids through a microscope and had them mounted to a board. Under the covers we replaced the two kinds of fuses, "AND term" fuses and "OR term" fuses, with a diffusion-layer "slug" and a pre-ohmic contact. It was one-to-one, so code that worked on an 82S100 could be sent over and someone could run a script to convert it. An excellent write-up on Commodore's C64 PLA was done by [Thomas 'skoe' Giesel] and embodied in this PDF (also mirrored here).

The initials of departed friends are still visible on the chip.

The PLA is an AND-OR array, meaning that each line of terms that is true — it satisfies the logic of "This and That but Not That over there" — is OR'ed together with the others, depending on which output you want to affect. OR'ing is like a dogpile: the chip doesn't care which of the strings of terms caused the true condition, it just knows that one or more did.
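To make the AND-OR idea concrete, here is a minimal software sketch of a PLA (my own illustration; the type names and example terms are invented, not taken from the MOS design). The AND plane decides which product terms are true, and the OR plane dogpiles them onto the outputs.

```typescript
// Toy PLA simulator: product terms AND together selected inputs (true,
// complemented, or don't-care), and each output ORs together the product
// terms assigned to it. Purely illustrative, not the MOS implementation.

type TermBit = 1 | 0 | "x";            // input used true, complemented, or ignored

interface Pla {
  terms: TermBit[][];                  // AND plane: one row per product term
  outputs: number[][];                 // OR plane: term indices feeding each output
}

function evalPla(pla: Pla, inputs: boolean[]): boolean[] {
  // AND plane: a term is true when every non-"x" bit matches its input.
  const termValues = pla.terms.map(term =>
    term.every((bit, i) => bit === "x" || (bit === 1) === inputs[i])
  );
  // OR plane: the "dogpile" -- any true term pulls the output true.
  return pla.outputs.map(termIdxs => termIdxs.some(i => termValues[i]));
}

// Example: out0 = (A AND NOT B) OR (B AND C), for inputs [A, B, C].
const pla: Pla = {
  terms: [[1, 0, "x"], ["x", 1, 1]],
  outputs: [[0, 1]],
};
console.log(evalPla(pla, [true, false, false])); // [ true ]
console.log(evalPla(pla, [false, true, true]));  // [ true ]
console.log(evalPla(pla, [false, false, true])); // [ false ]
```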

Example of the worksheet engineers used to create logic terms.

We pushed this device for everything it was worth. I had asked for a little over twice the terms, which means the array, being twice as big, had twice the resistance and twice the capacitance, and that translates to four times the RC component. That aside, I also won't bore you with the details of how the designer tried to insert a change without telling anybody and shorted all of the input pins to the back-bias ring with two weeks to go before CES. Simply put, we shoved as much into the chip as we could, and we wouldn't have had a viable computer without it. Mass-quantity production pricing brought the cost down to pennies and nickels, even after the $2.5 million NRE cost to design.
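The scaling is simple to check (this is just back-of-the-envelope arithmetic, not a figure from the design notes): if the delay through the array grows roughly with the RC time constant, then doubling both terms gives

$$ t \propto RC \;\Rightarrow\; t' \propto (2R)(2C) = 4RC $$

which is why doubling the size of the array costs roughly four times the delay, not two.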

PLA AND-OR array

So the PLA was a low-cost, high-return hit. We smiled. Then we looked at what we called the "jungle logic" left on the board, and our brows furrowed. This collection of gates would have been a prime candidate for the Programmable Logic Device (PLD) of the day, especially in the area where we were trying to make a 6502 microprocessor play nice with a Z80 microprocessor, all while not blowing up the DRAM. It would also have been ideal for the final stages of an FCC application, where you can only make minimal visible changes to the board without re-submitting. There was no joy in Mudville, however, as the price of PLDs was prohibitive at $4–$5 USD. Buying large quantities didn't result in large discounts, just large bills and logistics issues such as programming them. At that point I got my hands on a confidential stock and price sheet and picked the jungle logic from what was already in stock, provided there were hundreds of thousands available of any one part.

PLD architecture showing registers and feedback.

What made the PLD powerful was that it was relatively fast, since the internal array was small, and it added registers and feedback terms, which are the ingredients of state machines (we will discuss Mealy and Moore state machines at a later date). I believe CBM/MOS did try to do an NMOS version of this architecture, but the limitations of the NMOS process limited its usability. In the next installment, which will be published early tomorrow, I talk about more modern Complex Programmable Logic Devices (CPLDs) and program one from start to finish. Instead of doing gate reduction, the modern devices quite simply allow a designer to implement macro functions in quantities that simply couldn't be achieved otherwise, as "jungle logic" would be too slow and error-prone.
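Registers plus feedback are what turn a bare logic array into a state machine, and a quick sketch shows why (again a toy model of my own, not anything from a CBM design): the combinational array computes the next state from the inputs and the fed-back register bits, and the clock edge latches it.

```typescript
// Minimal sketch of a registered PLD as a state machine: combinational
// logic computes next state from (current state, input), and each "clock
// edge" latches the result back into the registers. Illustrative only.

type Logic = (state: number, input: boolean) => number;

function clockPld(nextState: Logic, inputs: boolean[], initial = 0): number[] {
  const trace: number[] = [];
  let state = initial;
  for (const bit of inputs) {
    state = nextState(state, bit);     // registers latch on each clock edge
    trace.push(state);
  }
  return trace;
}

// Example: a 2-bit counter that only advances while `enable` is high.
const counter: Logic = (state, enable) => (enable ? (state + 1) & 0b11 : state);
console.log(clockPld(counter, [true, true, false, true, true]));
// -> [1, 2, 2, 3, 0]
```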

CPLD Device showing assignments.


Filed under: classic hacks, Featured

How to know if you’re learning the right programming language


It's a question more and more people are beginning to ask themselves: Which programming language should I learn first? With hundreds of options to choose from, it is understandably easy for beginners to get overwhelmed. How do you know if you are learning the "right" language? How do you know if you are heading down a path that will give you a foundation to land a great job in the future?

Despite all the noise and nuanced opinions you'll read about what your first programming language should be, the best answer these days is simple:

Learn JavaScript.

If you don't know algebra yet, then start with Scratch.

I know. I know. I'm supposed to caveat my answer to this question with "If you like X, then try Y" or "If you are thinking about A, then try B." Those kinds of caveats come later. For now, let me back up and explain why JavaScript is the winner.

I first learned programming in elementary school via a Scratch-like language called Logo. After I was taught algebra in junior high, I learned BASIC, which introduced me to, well, the basics. In high school, in between programming silly games and animations on my TI-83 calculator, I learned Pascal in computer science class.

Even though Pascal was interesting, I was discouraged by its lack of obvious real-world applications. Although I tried to make a comeback by learning C in college, the biological sciences had already taken a firm hold of my interest. I didn't come back to programming until grad school, when I learned Ruby as I began building web apps on the side and needed a scripting language to efficiently work with data in my research.

I didn't fully realize how remarkably important JavaScript was until many years into building web apps. As web browsers became increasingly powerful and users demanded more desktop-like experiences, JavaScript became even more essential for web developers.

These days I recommend that students first learn algebra well (teachers, this is incredibly important!) and then jump straight into JavaScript. This combination allows students to learn the fundamentals of computer science and how to begin writing software, and it helps that JavaScript connects to plenty of real-world applications (web browsers, web servers, robotics, and more).

Ok, What's Next?

From here, what you should learn depends on what you want to build. If you're not sure, try experimenting with each category below and find something you enjoy.

If you want to write code for something other than web or mobile applications …

… then your next language should be Java or C. (Java is also used by web and mobile developers, so if you change your mind later, you can easily switch over).

Java, not to be confused with JavaScript, has become the standard programming language in introductory computer science courses in colleges around the world. It is a heavyweight, broadly adopted, object-oriented language that can be used for almost anything. Once you have a foundation in Java, continue forward with Groovy, Scala, and/or Clojure to build out your arsenal with modern-day scripting and concurrency-optimized (i.e., performs well at scale) languages.

C, on the other hand, is one of the most widely adopted languages of all time and is the foundation for most operating systems and higher-level languages used today. You should learn C if you want to be interfacing more with hardware and/or need to be at the cutting edge of computational optimization. After learning C, you should learn C++, which will give you even more appreciation for C, allow you to build even more amazing software, and set you up well to learn other programming languages.

If you want to build mobile apps …

… then, in light of Apple's latest announcement and especially if you're just getting started, you should begin learning Swift (within Xcode 6, currently in beta).

If you'd prefer to start with Android development, then you'll want to dive into Java, described above, which also gives you plenty of options for other types of software development as well.

If you want to build websites …

… then for now you can skip Java, C, and Swift and dive right into HTML and CSS. Learn how to build clean user interfaces (UIs) in a web browser as you continue to write JavaScript to enrich user experiences (UX).

If you absolutely love doing the layout work, and especially if you have strong visual design skills, then you should seriously consider laser-beaming your efforts toward becoming a "Front-End UI/UX Web Developer." You will need to learn CSS extraordinarily well (including, ideally, a CSS-extension language called Sass). Your JavaScript and HTML will also need to follow suit and be equally strong. There are shockingly few people who can design fantastic UIs/UX in web browsers, so if you choose this path and do it well, everyone will love you.

However, if you want to dive more into the how-to of bringing lots of data into your website (i.e., you are not satisfied with WYSIWYG platforms like Tumblr/Squarespace, or off-the-shelf content management systems like WordPress), then you are heading in the direction of becoming a "Web Developer," and your main options these days are Java, C#, Ruby, Python, and/or sticking with JavaScript. Each language has a variety of web development frameworks built on top of it; which one you use depends on a ton of different factors that I won't get into here. Whichever language you pick, you will then learn how to store data properly in databases, using languages such as SQL.

Java has a well-established community and, as I mentioned above, has options to do pretty much everything. It also puts you ahead of the game if you want to eventually write Android apps (which are built using Java).

C# (pronounced "C sharp") launches you into the Microsoft world, which — as you likely know — is a big world and has plenty of options for web development, especially within the .NET framework.

Ruby is much easier than Java or C# to get into, but it narrows your scope mainly to web apps and miscellaneous scripting tasks (although not exclusively). Ruby is famous for bringing "ease through conceptual elegance," and if you see yourself working for startups, or building one (or five) of your own, Ruby on Rails is a great option. I write with Rails and JavaScript on a daily basis and absolutely love it.

Python gives you everything Ruby does, plus a long-standing set of tools/libraries for more academic and scientific applications. The slight "cost" of this robustness, however, is that it is arguably less easy to learn (which isn't necessarily a bad thing). Still, the language prides itself on bringing "ease through consistency." The Python community is undeniably steady, which is extremely important when deciding what programming language to learn.

Finally, sticking with JavaScript to also do your back-end work is growing rapidly in popularity. Web apps are increasingly JavaScript-rich, which means all web developers need to be proficient at it. This is starting to create a natural unity and efficiency in the industry because professionals are able to write "full-stack" in a common language.

Whatever path you choose, start writing your own software as soon as possible

Too many beginners get stuck following (and getting frustrated with) tutorials. It's easy to copy and paste someone else's work and not have a clue what is going on under the hood.

The best solution to this problem is to think of an interesting project to work on and get after it. If you want to use an Arduino to build an interesting sensor or robot, go for it (it will force you to learn C/C++). If you want to build an app for your iPhone, then think about exactly how you want it to look and feel, and use tutorials and online resources to help you hook it up (for this, you'll need to learn Swift). Or if you want to build the next awesome web application, work to make it look exactly how you want it to in the web browser across various devices (which involves HTML and CSS). From there, to bring it alive and give your users the ability to add, modify, and view lots of data, you'll need to learn a web development framework (which will force you to learn its underlying language).

Finally, writing code is much more of an artistic craft than most people realize. It takes a significant number of years to get to a proficient level and first requires being a good apprentice under a community of developers to ensure you are learning the trade properly. If you are able to earn an advanced degree in computer science, you should. If not, get as far down the road as you possibly can with online resources, and sooner rather than later you should seek to make friends with programmers who can get to know you, evaluate your work, and help guide you in a fulfilling direction.

Will Little is co-founder and CEO of Code Fellows, a Seattle-based digital trade school that guarantees jobs to graduates of its intensive, eight-week Development Accelerator program. Will holds a Ph.D. in Bioengineering from ETH Zurich and has worked professionally as a Web developer and tech entrepreneur since 2005.


SpoonRocket hires a data scientist as it plans to expand to more cities


SpoonRocket has just hired a data scientist, Arif Ali. His work will help guide the company's expansion to other cities on the West Coast.

SpoonRocket engineers used to do data science part-time. Ali is the startup’s first dedicated data scientist.

"It's definitely one of those things we are trying to be ahead of the game to maintain our advantage," said SpoonRocket co-founder Anson Tsui in an interview with VentureBeat. "Without a data scientist, you are basically driving blind."

The startup is one of many that have brought data scientists aboard lately. Others include Lyft and Yplan.

According to Tsui, data science is important for SpoonRocket in three ways.

One is growth.

SpoonRocket uses data science to better analyze customer behavior and order patterns, in order to increase retention and drive user growth.

Two is internal operations.

Forecasting demand, estimating wait time, and routing — in the sense of how to drive efficiently and effectively — are the things SpoonRocket is doing with data science, Tsui said.

"We are actually very much a logistic company, now we are just food," said Tsui. "There's a lot of advanced stuff in the background to be able to deliver in 10 minutes."

Three is expansion.

"We are looking at a couple places right now. This is where really Arif comes into play, to really figure out what will be the smartest next step," said Tsui. "Because obviously, as a company I can be in a lot of different cities, but it's most important to prioritize, to figure out which city is gonna be the most impactful for our business."

Los Angeles, San Diego, Houston, and Seattle are all on SpoonRocket's radar. "We are definitely looking at a couple of the cities on the west coast," said Tsui. "We haven't made the decision yet, but it's definitely something we are working on."

As for Ali, he just graduated from the University of California, Berkeley, with a bachelor's degree in statistics.

When SpoonRocket's co-founders were running the not-so-healthy-food-delivery service Late Night Option for UC Berkeley students, they recruited Ali to analyze Facebook marketing data as a summer lead operations intern.

Before joining SpoonRocket, Ali gained more experience in data science through other summer data internships and by working as a data-analysis research assistant.

SpoonRocket, based in Berkeley, Calif., announced last month that it had taken on $11 million in venture funding.


Google and Carnegie Mellon University researchers unite to change the e-learning landscape


Google is going to school.

Google's research and special initiatives group is working with Carnegie Mellon University on an ambitious project that aims to lessen the dropout rates of students taking online courses and to overhaul the way people perceive MOOCs, or massive open online courses.

Google is sponsoring the project, committing $300,000 to the endeavor through its Focused Research Award. As it stands now, CMU researchers say dropout rates for students taking online courses are high because many courses lack an effective way of keeping students interested, unlike so-called traditional "brick and mortar" classes.

“Ninety percent of people that sign up for MOOCs never finish the course. We’re trying to build an intervention in order to keep the engagements up,” said CMU professor Robert Kraut of the school’s well-regarded Human-Computer Interaction Institute.

Above: CMU professor Robert Kraut (Image Credit: Robert Kraut)

According to CMU researchers in a release announcing the funding:

“The research plan includes development of techniques for automatically analyzing and providing feedback on student work, for creating social ties between learners and for designing MOOCS that are effective for students with a variety of cultural backgrounds.”

“I think we need to enable MOOCs to be more immersive, adaptive, and social,” Alfred Spector, Google’s VP for research and special initiatives, told VentureBeat.

Specifically, CMU's associate vice provost of tech strategy and impact, Justine Cassell, said MOOCs today generally translate to a "lecture-style presentation" with precious little interaction with teachers. She noted the learning gains are relatively modest even for those students who stick with a course for its entirety. Unless MOOCs pay attention to the way students actually learn in a real classroom, e-learning runs the risk of becoming what she called a passing fad.

Kraut agreed.

“One of the major problems about the hype with MOOCs is the potential to reach many people. We’re trying to understand what kind of problems to give students so when they work on the problems it improves the learning outcome,” Kraut said.

The initiative is broken down into three facets. The first entails using computerized processes to personalize the MOOC experience, with data-driven evaluations of students' work and pinpointing of the subjects each individual has mastered. In this case, the data pertains to individual students monitored by teachers. Second, researchers will seek ways to lessen the very real problem of student attrition by personalizing the "socialization" facet of education, since the courses are virtual.

Data will also indicate whether a student is in danger of dropping out, and project leaders will take action accordingly, re-engaging students in an attempt to hold their interest.

The third facet is making the available coursework more interactive and engaging. This area of the project also aims to adapt to students from other cultures, whose educational systems may differ from those in the States.

“CMU is definitely forward-thinking. With their extensive previous experience, we think they'll do great work on modeling learners, and making technology-based education more adaptive,” Spector said.


New FAA doc forbids drones from delivering Amazon packages, beer, & flowers

Above: Not today, Amazon. (Image Credit: Amazon)

Has the Federal Aviation Administration (FAA) rendered a new decision about Amazon’s planned drone delivery service?

A new document from the federal agency, brought to light yesterday for public comment, is gaining attention because it indirectly prohibits the kind of super cool drone delivery service that retail giant Amazon first presented to the world in a 60 Minutes story earlier this year.

The 17-page document relates primarily to the agency’s interpretation of drones that qualify as “model aircraft.” This is particularly important because the FAA is prohibited from issuing any rule or regulation about a model aircraft.

In addition to meeting certain specs (the weight of the device, for instance), the model aircraft must be flown strictly for "hobby or recreational use." The document includes a table showing activities that are clearly "hobby or recreation" and those that are not:

Above: From the FAA document (Image Credit: FAA)

One of the “Not Hobby or Recreation” activities — “delivering packages to people for a fee” — is being taken as a specific slam on Amazon’s proposal. (A footnote highlights that free shipping by a business is the same thing.)

“The information in the recent FAA notice is not new,” FAA spokesperson Alison Duquette told VentureBeat today. “It simply clarifies the ‘do's and don'ts’ for model aircraft operators so they can fly safely in accordance with the FAA Modernization and Reform Act of 2012.”

Matt Waite, head of the Drone Journalism Lab at the University of Nebraska, agrees.

“It’s never been under debate whether [Amazon drone delivery] was allowed,” Waite told VentureBeat, noting that CEO Jeff Bezos has acknowledged as much. “Amazon already knew it had to wait.”

Instead of targeting Amazon, Waite said, the new document is really aimed at several small companies that have been using drones commercially.

He pointed to the Lake Maid Beer Co. in northern Minnesota. That company was running a distinctly Minnesota-style drone delivery service, air-shuttling beer to ice fishermen. (Now, there's going to be some — how shall we put it? — dry ice up north.)

Other small businesses being targeted in the document include a drone delivery service for flowers near Detroit and a pharmaceuticals-by-drone operation in San Francisco, Waite said.

Via Ars Technica

