The future of Git is bright. It is displacing just about every version control system out there, and it is open source (GPLv2).

So what is being displaced by Git? Mercurial, SVN, and CVS.

So what is great about Git? Its flexibility, and its ability to manage code in any code-management workflow that you can think of. Some examples are:

- Local development (for individuals)
- Hub-Spoke (for teams; sketched below)
- MegaHub-Hub-Spoke (for multiple teams)
- Spoke-Spoke (for peer-to-peer development)
- Spoke-Spoke-Hub-Spoke-Spoke (for peer-to-peer and teams)

…and the combinations go on.
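
As a rough sketch of the hub-spoke setup (the host names here are made up for illustration): each developer clones from a shared hub, and can optionally fetch straight from a peer as well.

$ git clone git@hub.example.com:project.git                  # every spoke clones the hub
$ cd project
$ git remote add alice git://alice.example.com/project.git   # peer with another spoke
$ git fetch alice                                            # review a peer's work directly
$ git push origin master                                     # publish back to the hub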

So what unique concepts drive the distributed development capability of Git? These are:

- Efficient key-value file storage (http://git-scm.com/book/en/Git-Internals-Git-Objects) – see the example below
- Efficient and precise history/log tracking
- Strong performance, even on large repositories
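
To see the key-value store in action, you can write a blob straight into Git's object database and read it back by its hash (this pair of commands comes straight out of the Git Internals chapter linked above):

$ echo 'test content' | git hash-object -w --stdin
d670460b4b4aece5915caf5c68d12f560a9fe3e4
$ git cat-file -p d670460b4b4aece5915caf5c68d12f560a9fe3e4
test content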

But one of the key strengths of Git is its adoption by the Linux community. Git is the brainchild of Linus Torvalds (the creator of Linux).

SAP re-launches the HANA (High-performance ANalytic Appliance) platform (http://www.sap.com/solutions/technology/in-memory-computing-platform/hana/overview/index.epx) in 2012 and looks to this as the "game changing" technology for BI/DW/analytics. But is it?

Driven by corporate demand for real-time analytics, the HANA platform seeks to put data into memory and dramatically improve performance. This will help address the demand for big data, predictive analytics, and text mining.

But doesn't this sound like the typical rhetoric from computing vendors that have previously addressed technology issues by recommending more CPU, RAM, or disk space? SAP HANA is delivered as a software appliance focused on the underlying infrastructure for SAP BusinessObjects. This white paper (http://download.sap.com/download.epd?context=B576F8D167129B337CD171865DFF8973EBDC14E3C34A18AF1CF17ED596163658ABE46C2191175A1415B54F1837F5F0A13487B903339C6F98) suggests a lot of the scoping is centred around hardware and infrastructure design.

HANA makes audacious claims that traditional BI/DW folks would barely dare to whisper. The one that stands out is the "combination of OLAP and OLTP" in the one database. Ouch! Feel the wrath of the stakeholders of business operations. Another claim is running analytics in "mixed operations". Double ouch!

It's already challenging enough to get DW/BI solutions deployed without affecting operations. BI folks have constantly advocated separate infrastructure for analytics, with the ETL window as the firewall between systems. That same ETL window has also created delays for real-time analytics. Advocating a move of the BI/DW infrastructure back into operations is going to be a challenge. Yes, it gets us "closer to real-time", but it's going to be a challenge to make it work politically.

For other BI/DW vendors, this solution would be unfeasible, but because SAP also happens to be the largest ERP application platform on the planet, they definitely have a good shot at consolidating their ERP with HANA's BI analytics. Google, Facebook, and the large online behemoths already do it. So why not?!

This is indeed exciting, and it's definitely time to take a closer look at SAP HANA.


If you thought “Big Data” was already quite unmanageable, IEEE predicts a 1500% (x15) growth in data by 2015. That is 3 years from now.

On a similar scale, IEEE also suggests that terabit networks should be implemented soon to cater for the demand in network traffic by 2015. That is 40-1000 times up from today's gigabit networks.

This probably also suggests that demand for data processing and delivery will need to increase on a similar scale: some 10-40 times.

What products and skills will power the delivery of services for “Humungous Data”?

- New data systems – like GFS, BigTable, Hadoop, Hive, MapReduce (a word-count sketch follows below)
- New data patterns – NoSQL
- Cloud computing – a must for elastic computing vs BYO data centres
- Open data systems skills – unless you plan to pay for expensive database licenses
- Web services – to tie it all together
- Agile architecture – often under-rated, but increasingly important to focus corporate development
- Agile security – also under-rated, but increasingly important
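
As a rough illustration of the MapReduce programming model, here is a hedged sketch of a word-count mapper for Hadoop Streaming, written in PHP (Streaming lets you write the map/reduce steps in any language that reads STDIN and writes STDOUT; the file name mapper.php and the job wiring are assumptions for illustration). A matching reducer would simply sum the 1s for each word.

#!/usr/bin/php
<?php
// mapper.php - emit "word<TAB>1" for every word arriving on STDIN.
while (($line = fgets(STDIN)) !== false) {
    foreach (preg_split('/\s+/', trim($line)) as $word) {
        if ($word !== '') {
            echo $word, "\t1\n";
        }
    }
}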

With corporations already struggling to manage data growth and demand, will this mean a x15 growth in data staffing, or will a data specialist have to be x15 more productive? I believe it's a combination of both. New tools will make the data professional more effective. At the same time, because of the lack of training and skills transfer, there will always be a need for the human bridge.


The future is indeed exciting.

Kudos to Brittany Wenger from Lakewood Ranch, USA for winning Google's Science Fair Grand Prize. Using a 6-node artificial neural network (see her slides), and a lot of cloud computing power, Brittany has managed to train the neural network to detect malignant breast tumors with an accuracy of 99.11%.

Now, what is notable is that this girl is 17 years old. I was talking to some parents recently about how the amount of new knowledge being generated today is growing on an exponential scale. What this means is that the next generation of kids will have to learn more, and in less time. Now, I am sure neural networks have been implemented by geniuses far younger than 17.

The comparison I would like to make here is that I learnt neural networks at age 20 (and with minimal successful commercial application), whereas Brittany has a successful implementation of a neural network at age 17. I would now say that:

- My kids will probably be implementing neural networks at age 14-15
- Artificial intelligence is going to be more commonplace in the future

PHP 5.4


Here are some things you need to know about PHP 5.4.

1) Trait Support

Traits ease the limitations of single inheritance: a class can now reuse sets of methods by pulling in one or more traits, without needing a common parent class.
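
A minimal sketch (the Logger trait and the class names are made up for illustration):

<?php
// A trait bundles methods for horizontal reuse.
trait Logger {
    public function log($message) {
        echo get_class($this) . ": " . $message . "\n";
    }
}

class Order {
    use Logger;   // pull the Logger methods into this class...
}

class Invoice {
    use Logger;   // ...and reuse them here, with no shared parent
}

$order = new Order();
$order->log("created");   // prints "Order: created"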

2) Array Improvements

Array manipulation has improved with the introduction of array dereferencing. This removes the need to define temporary variables, and code looks neater. Now you can do:

// Previously, a temporary variable was needed:
$food = explode(",", "pizzahut,burgerking,kfc,mcdonalds");
echo $food[3]; // mcdonalds

// With array dereferencing (PHP 5.4), it is not:
echo explode(",", "pizzahut,burgerking,kfc,mcdonalds")[3]; // mcdonalds

3) $this Support in Closures

Closures can now use $this, so an anonymous function created inside a class method is automatically bound to that object.
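
A minimal sketch (the Counter class is made up for illustration):

<?php
class Counter {
    private $count = 0;

    public function incrementer() {
        // In PHP 5.3 this closure could not see $this;
        // in PHP 5.4 it is bound to the enclosing object automatically.
        return function () {
            return ++$this->count;
        };
    }
}

$counter = new Counter();
$inc = $counter->incrementer();
echo $inc(); // 1
echo $inc(); // 2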

4) Built-in CLI Web Server

A command-line web server is included in the PHP interpreter (intended for development, not production). This is invoked by:

$ cd ~/public_html
$ php -S localhost:8000
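
The server can also take an optional router script as an argument. A hedged sketch (the file name router.php and the echoed message are illustrative):

$ php -S localhost:8000 router.php

<?php
// router.php - return false to let the server deliver static files as-is,
// otherwise handle the request here.
if (preg_match('/\.(?:png|jpg|jpeg|gif|css|js)$/', $_SERVER["REQUEST_URI"])) {
    return false;
}
echo "Hello from the PHP 5.4 built-in server";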

5) <?= Support

As of PHP 5.4, the <?= short echo tag is always available, regardless of the short_open_tag setting in php.ini. Code looks neater. Example:

Previously we used: <?php echo $variable ?>

Now we can use: <?= $variable ?>



The Open Web Application Security Project (OWASP) has opened a chapter in Canberra. Kicked off by Andrew Muller of Ionize, OWASP brings web application security expertise to Canberra. It also brings together the small community of security professionals to meet, discuss, and engage in the crucial business of securing applications.

OWASP Canberra is committed to monthly meetings, and the occasional “special” meeting. See you there!

OWASP has a project called 'The OWASP Top Ten Project', which lists the top 10 security threats for web-based applications.

OWASP Current Top Twelve Threats

1. Cross-Site Scripting (XSS) – see the sketch below
2. Malicious File Execution
3. Insecure Direct Object References
4. Cross-Site Request Forgery (spoofing)
5. Information Leakage and Improper Error Handling (I'm guilty)
6. Injection Flaws
7. Broken Authentication and Session Management
8. Insecure Cryptographic Storage
9. Transport Layer Protection (TLP)
10. Failure to Secure URL Access (I'm guilty)
11. Security Misconfiguration
12. Unvalidated Redirects and Forwards

Ok, which ones are you guilty of?
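
For the first item on the list, the classic defence in PHP is to encode user input before echoing it back. A minimal sketch (the name parameter is made up for illustration):

<?php
// Reflecting user input without encoding invites XSS:
// echo "Hello, " . $_GET['name'];

// Encode for the HTML context instead:
$name = isset($_GET['name']) ? $_GET['name'] : 'guest';
echo "Hello, " . htmlspecialchars($name, ENT_QUOTES, 'UTF-8');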

Agile methods have gone through a roller-coaster ride of adoption. The first things a team member notices about agile are:

- Regular scrums – usually daily
- Micro-issue tracking
- Measurements

There are of course many other aspects to agile methods, and Evan Leybourn @ The Agile Director has a lot of experience in implementing agile methods in software development teams. He has a few courses running in Canberra and Sydney. Check them out:


- Sydney – 2-3 April 2012 – Agile Methods
- Sydney – 4 April 2012 – Advanced Agile Methods
- Canberra – 10-11 April 2012 – Agile Methods
- Canberra – 12 April 2012 – Advanced Agile Methods


Clouds are gaining in popularity. The demand for data, analytics, and forecasting has grown significantly, and the future might belong to those who are able to predict it. However, to predict the future requires computing power – Lots of it. And cloud providers, hosting companies, startups, and big-technology companies are looking at providing this.

So what exactly is the cloud, and why will it provide the computing capability that has been dominated by super-computers over the last two decades? And why will cloud computing succeed where grid computing failed?

On May 17, 1999, SETI@home was released, and it gave the public a glimpse of how inter-connected computers could be leveraged to perform very large tasks. Grid technology was encapsulated in products like Sun Grid and Xgrid, but largely failed to gain traction. The internet was only starting to go mainstream, and computers were still expensive items.

A decade and a bit later, in 2012, cloud computing is making headlines, and it seems that it may succeed where grid computing failed. So what has changed since 1999?

- Computers are cheaper
- The internet is much faster
- VMware and virtualisation are making inroads into organisations
- Hosting and infrastructure companies are virtualising
- Accessing virtual services like email, social media, and SaaS is commonplace
- Increased awareness of online computing via Amazon Web Services, SalesCloud, Azure

So will super-computing be replaced? Will there be reduced demand for running parallel jobs on multiple computing nodes? No. There is significantly increased demand for running compute-intensive and parallel jobs. However, the way a super-computer is implemented will change. Instead of proprietary platforms, super-computers will evolve to open platforms and be built on the cloud. The proprietary bits of super-computing will be the charging mechanisms for the utility.

Will grid computing be replaced? Grid computing will fade away; it addresses the same type of distributed super-computing that cloud computing would replace. The traditional super-computer might still serve a purpose for tightly-coupled applications which are difficult to distribute to the cloud or grid.

Consumers are not interested in a technology, but rather what they can do with it. In Cloud computing, this becomes more apparent with products like:

- Database processing
- Running an algorithm
- Getting an answer


OSDC11 was launched today in Canberra. With a small team of volunteers, the conference has managed to pull together some 250 participants, sponsors, and talented speakers for the 8th year running.

See photos here.

Photo gallery: OSDC Banner; DoD Intelligence & Security; Palantir Folks; Youngest Geek; Geek Grin; Ubuntu Swag; Google Folks; Pascal.

Recently, I’ve had the experience of purchasing and attempting to get a NAS device working on my network. The Seagate Black Armor was recommended. My experience with the device unfortunately has not been rewarding.

The Seagate Black Armor has been noted to support up to 4x3TB drives. That is a total of 12TB.

Upon opening the box, setting up the device, and inserting the four 3TB Seagate hard drives, the box ran smoothly: the lights lit up green and "Black Armor" appeared on the LCD screen. The tiny instruction booklet said that the drives should take 8-9 hours to be prepared before the words "Black Armor" would turn up on the LCD. It turned up in less than 10 minutes.

So maybe it was because I had been playing around with the hard disks: I had attempted to get them to work on a Red Hat workstation and had used fdisk/parted to manipulate the drives' partitions.
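
For anyone in the same position, one way to check for (and clear) leftover partitioning before handing drives to a NAS is sketched below. The device name /dev/sdb is an assumption, and zeroing the first sector destroys the partition table, so triple-check the device first:

$ parted /dev/sdb print                       # inspect any leftover partition table
$ dd if=/dev/zero of=/dev/sdb bs=512 count=1  # zero the MBR so the drive looks blank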

After querying the LCD screen for the IP address the NAS device had adopted, I opened a web browser to view/configure the NAS device. Access to the device via the browser was successful.

A simple login and, TADA, the drives were viewable. However, the NAS device had not properly initialised the four 3TB drives. They were just sitting there, and creating access volumes was a challenge.

Seagate's advice was to reset the Black Armor device. This is done by pressing the pinhole button at the back of the device. The NAS device lights would turn amber, and the device would finally reboot.