Breaking the status quo with SQL 2012 Development
SQL Server 2012 development, like other emerging technologies, is unlikely to be embraced as quickly as it should be. Many businesses are content with the status quo, and it isn't easy to determine whether a technology upgrade will be worth the investment. Owners often fall back on the "if it ain't broke, don't fix it" cliché, insisting that "what we've got works." And that's true: outdated technology, even a strong product like Microsoft SQL Server 2008, often still gets the job done.
Featured white paper:
SQL 2012: Bringing “Big Data” to the Desktop
Teresa Lhotka's latest white paper shows how enterprises can use SQL Server 2012 to harness big data from point of collection to presentation-ready reports.
But is there a faster path to insight?
Microsoft launched SQL Server 2012 on March 7, and in doing so it provided those savvy enough to adopt the new platform with a wealth of new features that can enhance long-term profitability.
New Features in Microsoft SQL 2012 Development
Thanks to SQL 2012, database development has never been simpler. New features include Project Crescent (since renamed Power View), a browser-based Silverlight reporting tool; the tabular model in SQL Server Analysis Services; and enhancements to SQL Server Reporting Services, including data alerts and a shared SharePoint service. SQL 2012 developers looking for more in-depth descriptions may be interested in Magenic consultant Joe Stariha's recent blog post on the Integration Services catalog in SQL 2012.
Compatibility with Hadoop Databases
SQL 2012 development will also benefit from last autumn's announcement that SQL 2012 will feature full integration with Hadoop. The Apache Software Foundation, which stewards Hadoop, has a long-standing relationship with Microsoft, and this latest collaboration will benefit both SQL 2012 developers and Hadoop developers. More importantly, it will benefit businesses that need to serve data on a massive scale to large numbers of users. For example, an enterprise financial institution already running SQL 2012 and looking to analyze data spread across dozens of servers could employ a Hadoop application: Hadoop would break that data apart, distribute it across those servers, and process the pieces in parallel. That level of data management, potentially handling petabytes of information, isn't practical with a traditional single-server database.
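The break-apart-and-recombine pattern described above can be illustrated with a minimal sketch. This is not Hadoop's actual API; it is a simplified stand-in for the MapReduce idea, using made-up transaction records and hypothetical function names, showing how per-node partial results for the same key can be computed independently and then merged:

```python
from collections import defaultdict

# Hypothetical transaction records: (account_id, amount).
transactions = [
    ("acct-1", 120.0), ("acct-2", 75.5), ("acct-1", 30.0),
    ("acct-3", 200.0), ("acct-2", 24.5), ("acct-3", 50.0),
]

def partition(records, num_nodes):
    """Split records across nodes by hashing the key, the way
    Hadoop's shuffle phase routes each key to one reducer."""
    shards = [[] for _ in range(num_nodes)]
    for key, value in records:
        shards[hash(key) % num_nodes].append((key, value))
    return shards

def local_reduce(shard):
    """Each node independently sums the values it holds per key."""
    totals = defaultdict(float)
    for key, value in shard:
        totals[key] += value
    return dict(totals)

def merge(partials):
    """Combine the per-node partial results into the final answer."""
    combined = defaultdict(float)
    for partial in partials:
        for key, value in partial.items():
            combined[key] += value
    return dict(combined)

shards = partition(transactions, num_nodes=3)
totals = merge(local_reduce(s) for s in shards)
print(totals)  # per-account totals, computed piecewise across shards
```

Because every record for a given account lands on the same shard, each node can finish its sum without coordinating with the others; Hadoop applies the same principle at the scale of many machines and petabytes of data.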