
When to Pay Attention to Competitors

Startups get conflicting advice about how to deal with competitors. Well-known advisors recommend “use your competitor’s products every day,” while others mandate “define your point of view, chart your course, and ignore the competition.” The best approach is a balanced implementation of both schools of thought; the key is knowing when to shift from the first strategy to the second.

As a startup founder, I know all too well the push/pull that competition tracking can have on staying true to a vision versus admitting that another team might have figured something out before yours did, and the distraction that ensues from reacting to every competitor’s market move. I realized I was falling into the trap of paying too much attention to what the competition was doing and losing focus on our core ethos.

An “easy” rule of thumb is to match competition-tracking effort to the stage of the business. The Gartner Hype Cycle offers a visual guide for when to track the competition as your project matures from the “Innovation Trigger” to the “Plateau of Productivity.”


Abundant Innovation – Sonian Summer 2011 CodeFest Delivers Impressive Results

The first quarterly all-engineering CodeFest concluded Tuesday evening (Aug 16, 2011) with three winning teams, one dramatic performance, and many laughs.

This post is linked from the Sonian Blog. Joe Kinsella, Sonian’s VP of Engineering, wrote about the CodeFest here.

The entire company was invited to view the presentations and vote for their favorites; the only voting rule was that you couldn’t vote for your own team. Judging was based on three criteria: 1. Impact on solving a Sonian or customer pain point (50%), 2. “Cool-ness” factor (25%), and 3. Presentation style and effectiveness in conveying the idea (25%).
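
To make the weighting concrete, here is a minimal sketch of how a team’s final score could be tallied. The criterion names, the 1–10 scale, and the function itself are illustrative assumptions, not the actual tally we used:

    # Hypothetical sketch of the weighted judging formula described above.
    # Assumes each criterion receives an average judge score on a 1-10 scale.
    WEIGHTS = {"pain_point_impact": 0.50, "coolness": 0.25, "presentation": 0.25}

    def team_score(scores):
        """scores: dict mapping criterion name to average judge score (1-10)."""
        return sum(weight * scores[name] for name, weight in WEIGHTS.items())

    # Example: a team strong on impact but middling on style.
    print(team_score({"pain_point_impact": 9, "coolness": 6, "presentation": 7}))  # 7.75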

Thirteen teams competed, representing the four functional units in the Sonian Engineering organization: SAFE (back-end), Website (front-end), DevOps (systems management), and QA, with several teams from each group. The themes ranged from automation and performance measurement to UI beautification and speed, and each team gravitated toward its “natural” inclinations. The DevOps teams focused on automating manual tasks and removing friction from deployments. The SAFE team (back-end) showcased applying “math” to performance measurement and data classification. The Website team looked at speed and a better user experience, and the QA team showed us new ways to think about cost testing alongside bug testing.

Six teams had a metrics or analytics theme, two focused on user interface improvements, and four came up with solutions for automation and deployment problems.

Instead of Ernst & Young tallying the votes, our Harvard-MBA-trained ROI analyst Chris H. stepped in to ensure a fair and accurate accounting.

And thanks to all the non-technical folks who sat patiently through presentations where terms like “latency,” “lazy loading,” “grepping logs,” and “foreground queues” were discussed.

Teams chose their presentation order, and the QA team volunteered to go first. Below is a rundown of each presentation, with context on how each idea fits into Sonian’s needs and long-term vision.

Congratulations to all the teams who competed! The next CodeFest is sure to be another interesting event.

Team 1: “You paid what for that… Export job, Object list request, or ES cluster?”

Andrea, Gopal, Bryan, and Jitesh from the quality assurance team rallied around an idea to extend testing methodologies into infrastructure cost analysis. To maximize the cloud’s economic advantage, the engineering team is always thinking about the cost of software operating at “big data” scale; from architecture to implementation, the goal is to infuse cost consciousness at every level. The QA team came up with a novel idea on this theme.

The proposed idea is to extend the testing framework to establish a baseline of per-feature infrastructure costs, and then measure successive releases against that baseline. A significant cost deviation from the baseline could be treated as a design flaw, an implementation error, or even a SEV1 bug. Sample features with measurable costs include an import job, an export request, or a re-index. Over time, an expense profile could be established for the entire app suite.
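
A minimal sketch of what such a cost-regression check might look like, assuming a hypothetical per-feature cost report pulled from cloud billing data. The baseline figures, threshold, and function names are illustrative assumptions, not Sonian’s actual framework:

    # Hypothetical cost-regression check: compare a release's measured
    # per-feature costs against a stored baseline and flag large deviations.
    BASELINE_USD = {"import_job": 1.20, "export_request": 0.45, "reindex": 3.10}
    THRESHOLD = 0.20  # flag anything more than 20% over baseline (illustrative)

    def cost_regressions(measured):
        """measured: dict of feature name -> observed cost in USD this release."""
        flagged = {}
        for feature, cost in measured.items():
            baseline = BASELINE_USD.get(feature)
            if baseline and (cost - baseline) / baseline > THRESHOLD:
                flagged[feature] = (baseline, cost)  # candidate design flaw / SEV1
        return flagged

    # Example: a re-index costing 50% more than baseline gets flagged.
    print(cost_regressions({"import_job": 1.25, "reindex": 4.65}))
    # -> {'reindex': (3.1, 4.65)}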

Having QA be an additional “cost analysis layer” in the full development cycle will only help make the Sonian software as efficient as possible.

Bonus points to the team for the most elaborate props and “dramatic performance” used in their presentation.

Read on for details on the twelve other teams.
