First published on the Actuate blog on February 4, 2015.
Think back to the last big purchase you made. Maybe you bought or rented a home, purchased a car, or chose a new computer or mobile provider. If you’re a smart shopper, you considered your decision from many angles: Does the product or service meet my needs simply and elegantly? Is the manufacturer solid and reliable? Will my choice serve me well into the future?
We face similar questions when we decide on a third-party vendor to embed technology in the apps we build: Is the technology powerful enough? Is it easy to embed? Will the vendor be around in the future? Will the technology evolve and improve as my needs – and those of my customers – change over time?
Elcom International faced such a decision almost a decade ago. Elcom’s software product, PECOS, digitizes the procurement process; Kevin Larnach, the company’s Executive Vice President of Operations, describes PECOS as “Amazon for business,” with extensive controls and business process integrations required by leading governments, utilities, businesses and healthcare providers. More than 120,000 suppliers provide products in PECOS through Elcom’s global supplier network, and PECOS is used by more than 200 organizations worldwide to manage more than $15 billion in total spending annually.
First published on the Actuate blog on December 22, 2014. I’m sharing it here not just as a writing sample, but to illustrate how my colleagues and I grew the company blog from a disjointed, ill-used channel into a lively and growing community. I not only write for the blog, but also serve as writing coach and editor for technical and business staff who want to contribute. The Actuate blog, apart from being a content channel unto itself, is a primary source of content for the company’s other social media channels, including Twitter, Facebook, LinkedIn, and Google Plus.
The last days of 2014 are upon us. It’s been a big year at Actuate, and one thing we’re proud of (among many – more about that next week) is the growth of this blog. Thank you for reading, commenting and sharing.
A little background: In April 2014 we consolidated several different company blogs onto a single platform and expanded the ranks of our contributors. We now have a dozen great people sharing news and information about embedded analytics, customer communications, document accessibility, data visualization, events, webinars and much more, all in one place.
Our team now strives to publish new content every workday. Since we ramped up our efforts, we've hit that goal, on average, every month except August. (Summer vacations, anyone?) This graph shows the number of blog posts we published per month in 2014.
(First published on the Actuate blog on June 10, 2014.)
If two heads are better than one, how much better are eleven hundred heads?
We believe they’re abundantly better, and that’s why we’re proud to share some highlights from a recent report that’s based on interviews with more than 1,100 Business Intelligence (BI) users worldwide.
(Published on the Actuate blog on May 28, 2014. It’s not every day that you get to write a lede like this for a tech company. I also wrote the infographic it’s based on.)
All hail the humble bucket. Buckets may not be fancy, but they’re useful, flexible and practical. We should all be so lucky.
Buckets also are ubiquitous: The paint on our walls and some of the food on our plates likely spent time in a bucket. Golfers relax at the driving range with a bucket of balls. Bikers, skiers and skaters affectionately call their helmets “brain buckets.” Coders use bucket sort to get numbers in sequence. We contemplate our future by compiling a bucket list.
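For the coders in the audience, here's what that bucket-based sorting actually looks like: a minimal, generic sketch of bucket sort in Python (an illustration only, not tied to any particular product or library). Values are scattered into buckets by range, each small bucket is sorted on its own, and the buckets are concatenated back together in order.

```python
def bucket_sort(values, num_buckets=10):
    """Sort numbers by distributing them into range-based buckets."""
    if not values:
        return []
    lo, hi = min(values), max(values)
    # Width of each bucket's value range; fall back to 1 when all values are equal.
    width = (hi - lo) / num_buckets or 1
    buckets = [[] for _ in range(num_buckets)]
    for v in values:
        # Clamp the top value into the last bucket.
        idx = min(int((v - lo) / width), num_buckets - 1)
        buckets[idx].append(v)
    result = []
    for bucket in buckets:
        result.extend(sorted(bucket))  # each bucket is small, so any simple sort will do
    return result

print(bucket_sort([0.42, 0.07, 0.93, 0.31, 0.58]))
# → [0.07, 0.31, 0.42, 0.58, 0.93]
```

Like the physical bucket, it's not fancy, but it gets the job done: when the input is spread evenly across its range, most of the work happens in many tiny, easy sorts.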
Enterprise application design is changing. Are your development practices changing with it?
That question was the crux of a recent webinar given by Allen Bonde, Actuate's vice president of product marketing and innovation. Drawing on more than two decades of experience as a developer and analyst, Bonde shared "5 Best Practices for Designing Data-Driven Apps" that developers and their leaders need to follow, in a May 14, 2014, webinar hosted by SD Times.
Have you recently wrestled with a Rubik's Cube? Struggled to solder? Prototyped with a 3D printer? Deployed a drone? Then maybe I saw you at the Maker Faire this past weekend.
The Maker Faire is a spinoff event of O'Reilly's Make magazine. A hybrid science fair/county fair, Maker Faire brings together thousands of hackers, builders, artists, cooks, gardeners, crafters, and others, all eager to strut their stuff and learn from each other in a spirit of open collaboration. There are 3D printers, Tesla coils, and crazy vehicles galore. And there's stuff to gawk at, such as a 26-foot-tall, fire-spouting octopus made of scrap metal, a drone battle zone, and a musical stage powered by listeners – the faster the audience pedals, the louder the music.
It's a lot of fun, but the Maker Faire has a serious side, too: to encourage experimentation, creativity, and problem-solving. Many exhibits and activities are made for kids, and some of them are made by kids. My favorite of those featured 12-year-old Saurabh Narain, who has built a robot based on the Lego Mindstorms EV3 Intelligent Brick that can solve a Rubik's Cube in 30 moves and less than two minutes. You have to see it to believe it. But don't feel bad about being bested by a 'bot: Saurabh's proud father admitted to me that he can't solve the Rubik's Cube either.
(Another one that’s lost a bit of its currency, though this is an annual event. This was my first post on the Actuate blog on April 8, 2014.)
This weekend—April 12 – 13, 2014—marks NASA’s second International Space Apps Challenge. Online and at almost 100 sites around the world—from Athens, Greece, to Zaragoza, Spain—volunteer developers, makers, and scientists will collaborate in an intense 48-hour marathon to create solutions based on 40 different challenges posed by the American space agency.
Some of the challenges involve hardware, while others are purely software development projects. This year, the challenges range from creating a greenhouse design (for growing food on Mars) to developing an app to display the force of gravity anywhere on Earth.
(OK, this one’s a month old, but every day is tax day, right? Published on the Actuate blog on April 15, 2014.)
It’s April 15, and in the United States that means one thing: Taxes.
Federal income tax returns are due with the Internal Revenue Service, or IRS. The IRS says about 35 million Americans have waited until the final week to submit their tax returns this year, so lots of us have taxes on the brain right now.
Being the taxman isn't a job for the insecure. In 2013, 40 percent of Americans had an unfavorable view of the IRS. But love 'em or hate 'em, the IRS is a data geek's dream source. By law, huge volumes of tax data are public record and available online. It's yours to play with; after all, you paid for it.
(Published on the Actuate blog on April 21, 2014.)
Hockey's Stanley Cup playoffs began last Thursday, and most hockey fans – particularly those whose teams are in the hunt – are in heaven. But hockey fans who also are students of analytics and data visualization are in a sort of purgatory, because hockey lags behind other sports in adopting advanced statistical analysis of players and teams. Put another way: There's no Moneyball for hockey.
Don’t take my word for it. A panel at last month’s MIT Sloan Sports Analytics Conference (called Hockey Analytics: Out of the Ice Age) noted that the rink “remains a less-than-forgiving climate for perfected analytical judgment and implementation.” That’s changing, though: Conference organizers also said “in-depth statistical analysis has permeated the front office of NHL teams” and noted that fans are getting in on the data action.
(Actually, it’s more than seven — we’ve added a few in the comments.)
We all know data science is an important (and growing) field, but most of us don’t have the time or money to study it full-time at a university. But take heart: We’ve found seven free or inexpensive options for data science training.
Many of these are high-level, theoretical courses, not specific programs to train you on data science tools. All the courses require some knowledge of statistics, programming, and data concepts (and one of them requires a lot more), so read the fine print.