
Build a Culture around Data for Analytics


Transforming Noise into Signal: Isolating Social Business Results

One of the biggest obstacles facing businesses engaged in social media today is clearly understanding the effect their efforts, particularly in externally facing processes such as marketing, are actually having. There are a number of reasons for this, and together they conspire to create an environment of uncertainty and an inability to claim fair credit for a company's hard work at social engagement with the marketplace. The first challenge to isolating results is the sheer scale of social conversation today. There are over a billion Facebook users, hundreds of millions of people on Twitter and LinkedIn alone, and thousands of smaller yet often more important social networks, online communities, and other niche, special-interest, and industry-specific forums where relevant social activity takes place.

Texas drought maps and photos

Various plans for dealing with future droughts and growing demand for water in Texas exist, but the most comprehensive, and most widely accepted, is the state Water Plan. It offers a frank assessment of the current landscape, saying Texas "does not and will not have enough water to meet the needs of its people, its businesses, and its agricultural enterprises." It predicts that "if a drought affected the entire state like it did in the 1950s," Texas could lose around $116 billion and over a million jobs, and the state's otherwise growing population could actually shrink by 1.4 million people.

Is Survival REALLY Survival without Cheese and Homemade Bread?

Welcome to this week's Survive The Coming Collapse newsletter, brought to you by Free Survival Cheat, a set of quick, actionable, and free preparedness and survival tips and tricks from The Fastest Way To Prepare course. Thanks to everyone who participated in the fundraising sale for victims and first responders helping with Sandy recovery efforts. I'm glad to say that I rounded up the amount raised and sent $1,000 to The Church At The Gateway on Staten Island earlier this week. If you participated, thank you.

Multi-core processor

[Figure: diagram of a generic dual-core processor, with CPU-local level 1 caches and a shared, on-die level 2 cache.]

Multi-core processors are widely used across many application domains, including general-purpose, embedded, network, digital signal processing (DSP), and graphics. The improvement in performance gained by the use of a multi-core processor depends very much on the software algorithms used and their implementation. In particular, possible gains are limited by the fraction of the software that can be run in parallel simultaneously on multiple cores; this effect is described by Amdahl's law.
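Amdahl's law, mentioned above, can be stated directly: if a fraction p of a program can run in parallel across n cores, the maximum speedup over a single core is 1 / ((1 - p) + p / n). A minimal sketch in Python (the function name is mine, purely illustrative):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Maximum speedup predicted by Amdahl's law.

    parallel_fraction: share of the work that can run in parallel (0..1).
    cores: number of cores the parallel share is spread across.
    """
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / cores)

# A program that is 90% parallelizable gains only about 3.08x on 4 cores:
speedup_4_cores = amdahl_speedup(0.9, 4)

# And a program that is 50% serial can never exceed 2x, no matter how
# many cores are added:
speedup_many_cores = amdahl_speedup(0.5, 10**6)
```

This is why the article stresses the software side: the serial fraction, not the core count, quickly becomes the binding limit.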

42 Big Data Startups – Big Data News

Published by Jeff Vance at Startup50. Which ones are missing? I would add Pervasive, Tableau, Splunk, Lavastorm, Yottamine, Alteryx, and Pivotal, as well as non-product companies; for instance, publishers like DataScienceCentral (self-funded, profitable, with a large list of big data clients). This list contains (too) many Hadoop-related companies. Which companies would you add?

How Linked Lifecycle Data can transform your systems engineering environment

Background: the web of documents. Everyone who has ever used a browser is familiar with the World Wide Web that we've been enjoying for many years. This Web, really a web of documents, has provided a foundation for us to share previously unimaginable amounts of information, yet it has some key implementation details that ultimately impose a limit on its usefulness. The Web represents information as text on pages. It was designed to allow humans to read, filter out redundant information, and infer meaning based on the natural language used, the context of the information, and the existing knowledge of the reader. In other words, we humans glean data from the Web pages that we read.

Big Data means Advanced Data Visualization

The last few years have been particularly exciting for data visualization. We've witnessed a boom in the popularity of infographics and in tools to help create everyday visualizations for practical purposes. With all these exciting developments, it's difficult not to wonder what the future of this field will look like. Industry-renowned data visualization expert Edward Tufte once said, "The world is complex, dynamic, multidimensional; the paper is static, flat. How are we to represent the rich visual world of experience and measurement on mere flatland?"

What Katrina Can Teach Libraries About Sandy and Other Disasters - Wired Campus

Disaster plans used to seem like "kind of a bother" to Lance D. Query, Tulane University's director of libraries. Then, in 2005, Hurricane Katrina hit New Orleans, flooding Tulane's Howard-Tilton Memorial Library with more than eight feet of water. "I look at them much more carefully now," says Mr. Query.

In-database processing

In-database processing, sometimes referred to as in-database analytics, refers to the integration of data analytics into data warehousing functionality. Today, many large databases, such as those used for credit card fraud detection and investment bank risk management, use this technology because it provides significant performance improvements over traditional methods.[1]

History

Traditional approaches to data analysis require data to be moved out of the database into a separate analytics environment for processing, and then back to the database (SPSS from IBM is an example of a tool that still works this way). Doing the analysis in the database, where the data resides, eliminates the costs, time, and security issues associated with the old approach by doing the processing in the data warehouse itself.[2]
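The contrast described above can be sketched with a toy example: the same aggregate computed first by extracting every row into the application, then by pushing the computation to the database engine so that only the results move. This is an illustrative sketch only; the table and column names are made up, and an in-memory SQLite database stands in for a real data warehouse:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (account_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO transactions VALUES (?, ?)",
    [(1, 120.0), (1, 80.0), (2, 300.0), (2, 50.0)],
)

# Traditional approach: pull every row out of the database, compute in the app.
rows = conn.execute("SELECT account_id, amount FROM transactions").fetchall()
totals_out = {}
for account_id, amount in rows:
    totals_out[account_id] = totals_out.get(account_id, 0.0) + amount

# In-database approach: the engine does the aggregation; only totals move.
totals_in = dict(conn.execute(
    "SELECT account_id, SUM(amount) FROM transactions GROUP BY account_id"))

assert totals_out == totals_in  # same answer, far less data movement at scale
```

With four rows the difference is invisible, but when the table holds billions of fraud-detection records, avoiding the round trip out of the warehouse is exactly the performance gain the article describes.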

Visualization-based data discovery tools

Visualization-based data discovery tools may account for less than 5% of the Business Intelligence (BI) market, but they are punching above their weight in terms of profile. In 2011, Gartner placed visualization at the peak of the BI Hype Cycle. Although this suggests the category may lose some of its lustre, Gartner is still predicting a compound annual growth rate of 30% for each of the next five years.

Does Big Data Need Bigger Data Quality and Data Management?

By Virginia Prevosto and Peter Marotta. Running faster won't get you to the right place if you don't know where you're going. Even if you do know your destination, you need the right road markers to help you on the way. Big Data and, more important, the analytics that Big Data fuels are the technology du jour.
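To put the 30% compound annual growth rate cited above for data discovery tools in perspective: sustained for five years it multiplies the starting figure by 1.3 to the fifth power, roughly 3.7x. A quick sketch of the arithmetic (the function is mine, for illustration only):

```python
def compound_growth(initial, annual_rate, years):
    """Size of a quantity after compounding an annual growth rate."""
    return initial * (1 + annual_rate) ** years

# 30% CAGR over 5 years: the market segment nearly quadruples.
multiplier = compound_growth(1.0, 0.30, 5)  # 1.3**5 = 3.71293
```

So even a category holding under 5% of the BI market today would, at that rate, become a substantial share within the forecast window.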
