Posts Tagged ‘Business Intelligence’

Periscope Data is a cloud-based business intelligence analytics and distribution platform. It takes the pain out of data loading by connecting directly to your data sources, with no messy ETL.

Periscope visualizes your data as charts, graphs and dashboards. All you need to do is write SQL queries in Periscope, and it returns charts, reports and dashboards that you can share or embed.

Periscope is licensed by the number of data rows you share with it, with no limit on users. Your Periscope package includes unlimited charts, unlimited users, dashboards, unlimited embedding and white-labeling, and unlimited support.

Package pricing starts at $1,000 a month for up to 1 billion rows of data and scales linearly from there. There is no annual commitment; you can pay month to month.
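As a back-of-the-envelope sketch, the linear pricing above can be turned into a tiny estimator. The $1,000 floor for the first billion rows and the strictly linear slope beyond it are assumptions read off this post, not Periscope's actual rate card:

```python
def monthly_cost_usd(rows: int, rate_per_billion: float = 1000.0) -> float:
    """Estimate the monthly bill: a $1,000 floor covers the first
    billion rows, then the price scales linearly with row count.
    (Assumed from the advertised pricing, not an official formula.)"""
    billions = max(1.0, rows / 1_000_000_000)
    return billions * rate_per_billion

# 500M rows still pays the $1,000 floor; 3B rows pays 3x.
print(monthly_cost_usd(500_000_000))    # 1000.0
print(monthly_cost_usd(3_000_000_000))  # 3000.0
```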

You can take advantage of the Periscope caching tool at no additional cost. Caching reduces load on your database, delivers faster performance, and gives you the ability to upload CSVs and do cross-database joins. Your queries can run 150x faster with Periscope caching.

https://www.periscopedata.com/
http://wiki.glitchdata.com/index.php?title=Periscope_Data

GoodData is a company that has been on the data scene since 2007. Founded by Roman Stanek, the former CEO of NetBeans and Systinet, GoodData seems to be in good hands. Stanek previously sold NetBeans to Sun Microsystems in 1999, and Systinet to HP in 2006.

GoodData has raised USD 53.5M in venture funding from the likes of Andreessen Horowitz, Tim O’Reilly, AlphaTech Ventures, General Catalyst Partners, Windcrest Partners, Intel Capital and TOTVS. It employs 291 staff across 5 offices in Prague, Brno, San Francisco, Portland, and Boston.

GoodData has a joint venture with Chris Gartlan to grow an APAC presence. Based in Melbourne, Australia, GoodData APAC has a team of 10 staff focused on growing the business.

So what is the GoodData value proposition? It’s simply a fully managed, cloud-based business intelligence platform. GoodData does it end-to-end, taking on the capital costs of building data warehouses and data marts, and providing speed and agility in delivering results.

These results are actionable insights that would cost anywhere from 7x to 15x more to deliver with traditional data integration. So whether you run a lean OPEX or CAPEX model, this solution can be tailored to your requirements.

Agility comes in the form of a managed solution: business units can now independently build data marts and visualise data. This is where cloud-based BI shines.

So what are GoodData’s strengths with such a broad focus across a very big data chain? Customer focus seems to be the key. Even with a fully out-of-the-box solution, GoodData is agile enough to custom-fit various parts of the data chain, from data integration and data storage systems to visualisation components.

Outsourced cloud-based BI is the new spin on the disk.

If you haven’t heard of Yellowfin BI, it is a passionate startup focused on making Business Intelligence easy. Established in 2003, Yellowfin has been developed to satisfy a range of BI needs, from small businesses to massive enterprise deployments and software vendors.

Yellowfin makes a Business Intelligence platform built on top of Tomcat/Java that processes and presents information in refreshing detail. It’s easy to assemble, and allows you to focus on rapidly building new business value. Yellowfin can be deployed on any server (cloud or on-premises).

Yellowfin is the second Australian vendor ever to make it into the Gartner Magic Quadrant.

Growing organically, it can barely be called a startup these days, with more than 100 employees and offices in 4 different countries. Yellowfin is running a series of presentations of its technology in December. These are:

Melbourne – 1 Dec
Sydney – 2 Dec
Auckland – 3 Dec

Register for the event today!

Gartner is hosting the Business Intelligence & Information Management (BIIM) Summit, a must-attend event for BI/IM folks looking into the evolution of BI/IM to provide value to their organisations. What is interesting are topics like “the last mile for BI” and the move to “predictive and prescriptive” data.

Two thoughts come to mind when I look at the details of this conference.

For a lot of organisations, “the last mile” is still very far away. Without a methodical, data-centric approach to BI, many organisations are stuck in no-man’s land: the chaos of semi-defined master data, a plethora of link and translation tables, poor definition of transactional and analytical datasets, half-fuelled performance at all levels of the BI stack, and probably significantly under-performing BI.

Naturally, business owners want to get value from their BI/IM investments. Traditionally this has been considered an important and justifiable cost to the organisation, especially for decision making, but it is increasingly important to extend that capability into future scenarios. Future scenarios involve the unknown: unknown datasets, unknown parameters, unknown visualisations, unknown BI/IM capability. These are the topics I would like to see covered at this conference.

Anyway, I’ll be attending the conference. Email me if you want to meet.

Conference Tracks

Trends and Futures
Information Innovation
Performance Management
Social and Big Data
Virtual Tracks

SAP re-launched the <a href="http://www.sap.com/solutions/technology/in-memory-computing-platform/hana/overview/index.epx">HANA</a> (<strong>H</strong>igh-performance <strong>AN</strong>alytic <strong>A</strong>ppliance) platform in 2012 and looks to it as the “game changing” technology for BI/DW/analytics. But is it?

Driven by the corporate demand for real-time analytics, the HANA platform seeks to put data into memory and dramatically improve performance. This will help address the demand for big data, predictive analytics, and text mining.

But doesn’t this sound like the typical rhetoric from computing vendors, who have previously addressed technology issues by recommending more CPU, RAM or disk space? SAP HANA is delivered as a software appliance focused on the underlying infrastructure for SAP Business Objects. This <a href="http://download.sap.com/download.epd?context=B576F8D167129B337CD171865DFF8973EBDC14E3C34A18AF1CF17ED596163658ABE46C2191175A1415B54F1837F5F0A13487B903339C6F98">white paper</a> suggests a lot of the scoping is centred on hardware and infrastructure design.

HANA makes audacious claims that traditional BI/DW folks would falter to whisper. The one that stands out is the combination of OLAP and OLTP in a single database. Ouch! Feel the wrath of the stakeholders of business operations. Another claim is running analytics in “mixed operations”. Double ouch!

It’s already challenging enough to get DW/BI solutions deployed without affecting operations. BI folks have constantly advocated separate infrastructure for analytics, with the ETL window as the firewall between systems. That same ETL window has also created delays for real-time analytics. Advocating a move of the BI/DW infrastructure back into operations is a hard sell. Yes, it facilitates “closer to real-time”, but it’s going to be a challenge to make it work politically.
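A toy illustration of why the OLAP/OLTP split exists in the first place: transactional lookups favour row-oriented storage, while analytical scans favour column-oriented storage. This sketch (plain Python, nothing to do with HANA's actual in-memory engine) shows the same small table in both layouts:

```python
# The same orders table stored two ways:
# row-oriented (OLTP-friendly) and column-oriented (OLAP-friendly).
rows = [
    {"order_id": 1, "customer": "acme", "amount": 120.0},
    {"order_id": 2, "customer": "globex", "amount": 75.5},
    {"order_id": 3, "customer": "acme", "amount": 30.0},
]

columns = {
    "order_id": [1, 2, 3],
    "customer": ["acme", "globex", "acme"],
    "amount": [120.0, 75.5, 30.0],
}

# OLTP-style point lookup: the row store hands back a whole record at once.
def lookup_order(order_id):
    return next(r for r in rows if r["order_id"] == order_id)

# OLAP-style aggregate: the column store scans one contiguous list,
# never touching the other columns.
def total_revenue():
    return sum(columns["amount"])
```

HANA's pitch is essentially that with everything in memory, one system can serve both access patterns, removing the ETL hop between the two copies.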

For other BI/DW vendors this solution would be unfeasible, but because SAP also happens to be the largest ERP application platform on the planet, it definitely has a good shot at consolidating its ERP with HANA’s BI analytics. Google, Facebook and the other large online behemoths already do it. So why not?!

This is indeed exciting, and it’s definitely time to take a closer look at SAP HANA.
