BarbaraHudson writes Business Insider is reporting that despite billions of sign-ups, almost nobody is publicly active on Google+. Analytics and visualization blogger Kevin Anderson studied data compiled by Edward Morbius, who says that just 9% of Google+'s 2.2 billion users actively post public content. "We've got a grand spanking total of 24 profiles out of 7,875 whose 2015 post activity isn't YouTube comments but Google+ posts. That's a 0.3% rate of all profile pages, going back to our 2.2 billion profiles. No wonder Dave Besbris (Google+ boss) doesn't want to talk about numbers," Morbius writes. For those interested, both his methodology and the scripts used can be found here.
209 comments | 5 days ago
theodp writes Some of the world's leading Data Scientists are on the payrolls of Microsoft, Google, Facebook, Yahoo, and Apple. So, it'd be interesting to get their take on the infographics the tech giants have passed off as diversity data disclosures. Microsoft, for example, reported its workforce is 29% female, which isn't great, but if one takes the trouble to run the numbers on a linked EEO-1 filing snippet (PDF), some things look even worse. For example, only 23.35% of its reported white U.S. employee workforce is female (Microsoft, like Google, footnotes that "Gender data are global, ethnicity data are US only"). And while Google and Facebook blame their companies' lack of diversity on the demographics of U.S. computer science grads, CS grad and nationality breakouts were not provided as part of their diversity disclosures. Also, the EEOC notes that EEO-1 numbers reflect "any individual on the payroll of an employer who is an employee for purposes of the employer's withholding of Social Security taxes," further muddying the disclosures of companies relying on imported talent, like H-1B-dependent Facebook. So, were the diversity disclosure mea culpas less about providing meaningful data for analysis, and more about deflecting criticism and convincing lawmakers there's a need for education and immigration legislation (aka Microsoft's National Talent Strategy) that's in tech's interest?
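For anyone curious what "running the numbers" on an EEO-1 snippet amounts to, the arithmetic is simple. Below is a minimal Python sketch of the calculation; the counts are made-up placeholders purely for illustration, not Microsoft's actual filing figures.

```python
# Sketch: derive the female share within one race/ethnicity group from
# EEO-1 style counts. All numbers below are illustrative, not real data.
eeo1_counts = {
    # (race_ethnicity, gender): U.S. employee count  -- made-up values
    ("White", "Male"):   40_000,
    ("White", "Female"): 12_000,
    ("Asian", "Male"):   15_000,
    ("Asian", "Female"):  6_000,
}

def female_share(counts, group):
    """Percentage of a race/ethnicity group's employees who are female."""
    female = counts[(group, "Female")]
    total = female + counts[(group, "Male")]
    return 100 * female / total

print(f"White U.S. workforce female share: {female_share(eeo1_counts, 'White'):.2f}%")
```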
335 comments | about two weeks ago
mikejuk (1801200) writes A survey of UK schools carried out by Microsoft and Computing at School reveals some worrying statistics that are probably more widely applicable. The survey found that 68% of primary and secondary teachers are concerned that their pupils have a better understanding of computing than they do. The pupils reinforced this finding, with 47% claiming that their teachers need more training. To push the point home, 41% of pupils admitted to regularly helping their teachers with technology. This isn't all down to the teachers being new to the task — 76% had taught computing before the new curriculum was introduced. It seems the problem stems from switching away from an approach that emphasised computer literacy to one that actually asks students to do more difficult things.
388 comments | about two weeks ago
helix2301 writes: Google's grip on the Internet search market loosened in December, as the search engine saw its largest drop since 2009. That loss was Yahoo's gain, as the Marissa Mayer-helmed company added almost 2% from November to December to bring its market share back into double digits. Google's lead remains overwhelming, with just over three-quarters of the search market, according to StatCounter Global Stats. Microsoft's Bing gained some momentum to take 12.5% of the market. Yahoo now has 10.4%. All other search engines combined take the remaining 1.9%.
155 comments | about two weeks ago
mrspoonsi writes The number of people going to the movies in North America in 2014 slipped to its lowest level in two decades. According to preliminary estimates, roughly 1.26 billion consumers purchased cinema tickets between Jan. 1 and Dec. 31. That's the lowest number since 1.21 billion in 1995. Year-over-year, attendance looks to be off 6 percent from 2013, when admissions clocked in at 1.34 billion. Admissions have fluctuated dramatically over the years, particularly since the advent of modern-day 3D, which can skew the average ticket price. Moviegoing in North America hit an all-time high in 2002, when 1.57 billion consumers lined up, thanks in part to Spider-Man, The Lord of the Rings: The Two Towers, Star Wars: Episode II — Attack of the Clones, Harry Potter and the Chamber of Secrets and My Big Fat Greek Wedding.
400 comments | about three weeks ago
An anonymous reader writes: Anthony Ferrara, a developer advocate at Google, has published a blog post with some statistics showing the sorry state of affairs for website security involving PHP. After defining a list of secure and supported versions of PHP, he used data from W3Techs to make a rough comparison between the number of secure installs and the number of insecure or outdated installs. After doing some analysis, Ferrara sets the upper bound on secure installs at 21.71%. He adds, "These numbers are optimistic. That's because we're counting all version numbers that are maintained by a distribution as secure, even though not all installs of that version number are going to be from a distribution. Just because 5.3.3 is maintained by CentOS and Debian doesn't mean that every install of 5.3.3 is maintained. There will be a small percentage of installs that are from-source. Therefore, the real 'secure' number is going to be less than quoted." Ferrara was inspired to dig into the real-world stats after another recent discussion of responsible developer practices.
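The bookkeeping behind such an upper bound is straightforward. Here is a minimal Python sketch of the idea; the version shares and the "secure" list are made-up placeholders, not Ferrara's dataset.

```python
# Sketch: bucket observed PHP version shares (as reported by a source like
# W3Techs) into "secure/supported" versus everything else and sum the shares.
# All percentages and the secure-version list below are illustrative.

# share of surveyed sites running each PHP version, in percent (made-up values)
version_share = {
    "5.3.3":  8.0,   # old upstream, but still patched by some distributions
    "5.4.4":  6.0,
    "5.5.9":  4.0,
    "5.6.3":  2.5,
    "5.2.17": 10.0,  # end-of-life and unmaintained
    "5.3.10": 7.0,
}

# versions still receiving security fixes upstream or from a major distribution
secure_versions = {"5.3.3", "5.4.4", "5.5.9", "5.6.3"}

secure_share = sum(share for v, share in version_share.items() if v in secure_versions)
total_share = sum(version_share.values())
print(f"Upper bound on secure installs: {100 * secure_share / total_share:.2f}% of surveyed installs")
```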
112 comments | about a month ago
An anonymous reader writes: As much as we like complaining, and as much as the big media stations like to focus on the most horrible news of the day, the world is actually becoming a better place. Steven Pinker and Andrew Mack have an article in Slate going through many of the statistics for things like homicide rates, child abuse, wars, and even autocracy vs. democracy. They're all trending in the right direction. Maybe not fast, or even fast enough, but it's getting better.
They say, "Too much of our impression of the world comes from a misleading formula of journalistic narration. Reporters give lavish coverage to gun bursts, explosions, and viral videos, oblivious to how representative they are and apparently innocent of the fact that many were contrived as journalist bait. Then come sound bites from "experts" with vested interests in maximizing the impression of mayhem: generals, politicians, security officials, moral activists. The talking heads on cable news filibuster about the event, desperately hoping to avoid dead air. Newspaper columnists instruct their readers on what emotions to feel. There is a better way to understand the world. ... An evidence-based mindset on the state of the world would bring many benefits."
208 comments | about a month ago
KentuckyFC writes Statisticians have long thought it impossible to tell cause and effect apart using observational data. The problem is to take two sets of measurements that are correlated, say X and Y, and to find out whether X caused Y or Y caused X. That's straightforward with a controlled experiment in which one variable can be held constant to see how it influences the other. Take, for example, a correlation between wind speed and the rotation speed of a wind turbine. Observational data gives no clue about cause and effect, but an experiment that holds the wind speed constant while measuring the speed of the turbine, and vice versa, would soon give an answer. In the last couple of years, however, statisticians have developed a technique that can tease apart cause and effect from observational data alone. It is based on the idea that any set of measurements always contains noise. However, the noise in the cause variable can influence the effect but not the other way round, so the noise in the effect dataset is always more complex than the noise in the cause dataset. The new statistical test, known as the additive noise model, is designed to find this asymmetry. Now statisticians have tested the model on 88 sets of cause-and-effect data, ranging from altitude and temperature measurements at German weather stations to the correlation between rent and apartment size in student accommodation. The results suggest that the additive noise model can tease apart cause and effect correctly in up to 80 per cent of cases (provided there are no confounding factors or selection effects). That's a useful new trick in a statistician's armoury, particularly in areas of science where controlled experiments are expensive, unethical or practically impossible.
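For readers who want to see the intuition in code, here is a minimal Python sketch of the additive-noise idea: fit a regression in both directions and prefer the direction whose residuals look more independent of the putative cause. The polynomial fit and the correlation-based independence score are simplifications chosen for brevity, not the exact test the researchers used.

```python
# Sketch of the additive-noise intuition, not the published method.
import numpy as np

def dependence_score(x, residuals):
    """Crude independence proxy: correlation of the input with the residuals
    and with the squared residuals (catches noise whose spread varies with x)."""
    r1 = abs(np.corrcoef(x, residuals)[0, 1])
    r2 = abs(np.corrcoef(x, residuals ** 2)[0, 1])
    return r1 + r2

def anm_direction(x, y, degree=3):
    """Return 'x->y' or 'y->x' according to which fit leaves residuals
    that appear more independent of the putative cause."""
    fwd = y - np.polyval(np.polyfit(x, y, degree), x)   # residuals of y ~ f(x)
    bwd = x - np.polyval(np.polyfit(y, x, degree), y)   # residuals of x ~ g(y)
    return "x->y" if dependence_score(x, fwd) < dependence_score(y, bwd) else "y->x"

# Toy example: y is caused by x plus additive noise.
rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, 2000)
y = x ** 3 + rng.normal(0, 1, x.size)
print(anm_direction(x, y))  # expected: 'x->y'
```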
137 comments | about a month ago
MTorrice writes The 2008 recession hammered the U.S. auto industry, driving down sales of 2009 models to levels 35% lower than those before the economic slump. A new study has found that because sales of new vehicles slowed, the average age of the U.S. fleet climbed more than expected, increasing the rate at which the fleet releases air pollutants.
In 2013, the researchers studied the emissions of more than 68,000 vehicles on the roads in three cities—Los Angeles, Denver, and Tulsa. They calculated the amount of pollution released per kilogram of fuel burned for the 2013 fleet and compared the rates to those that would have occurred if the 2013 fleet had the same age distribution as the prerecession fleet. For the three cities, carbon monoxide emissions were greater by 17 to 29%, hydrocarbons by 9 to 14%, nitrogen oxide emissions by 27 to 30%, and ammonia by 7 to 16%.
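As a rough illustration of the counterfactual comparison described above, the sketch below weights per-age emission rates by two different fleet-age distributions and compares the averages. The rates and distributions are made-up placeholders, not the study's data.

```python
# Sketch of a fleet-age counterfactual comparison with illustrative numbers.

# grams of CO per kg of fuel burned, keyed by vehicle age in years (made-up)
co_rate_by_age = {0: 5.0, 5: 12.0, 10: 25.0, 15: 45.0}

# fraction of the fleet in each age bin (each distribution sums to 1.0, made-up)
fleet_2013         = {0: 0.15, 5: 0.30, 10: 0.35, 15: 0.20}
fleet_prerecession = {0: 0.25, 5: 0.35, 10: 0.30, 15: 0.10}

def fleet_average(rates, age_distribution):
    """Fleet-average emission rate: per-age rates weighted by fleet share."""
    return sum(rates[age] * share for age, share in age_distribution.items())

observed = fleet_average(co_rate_by_age, fleet_2013)
counterfactual = fleet_average(co_rate_by_age, fleet_prerecession)
print(f"CO emissions higher by {100 * (observed / counterfactual - 1):.1f}%")
```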
176 comments | about a month and a half ago
HughPickens.com writes Jason Kane reports at PBS that emergency treatments delivered in ambulances that offer "Advanced Life Support" for cardiac arrest may be linked to more deaths, comas and brain damage than those providing "Basic Life Support." "They're taking a lot of time in the field to perform interventions that don't seem to be as effective in that environment," says Prachi Sanghavi. "Of course, these are treatments we know are good in the emergency room, but they've been pushed into the field without really being tested, and the field is a much different environment." The study suggests that high-tech equipment and sophisticated treatment techniques may distract from what's most important during cardiac arrest — transporting a critically ill patient to the hospital quickly.
Basic Life Support (BLS) ambulances stick to simpler techniques, such as chest compressions, basic defibrillation and hand-pumped ventilation bags to assist with breathing, with more emphasis placed on getting the patient to the hospital as soon as possible. Survival rates for out-of-hospital cardiac arrest patients are extremely low regardless of the ambulance type, with roughly 90 percent of the 380,000 patients who experience cardiac arrest outside of a hospital each year not surviving to hospital discharge. But researchers found that 90 days after hospitalization, patients treated in BLS ambulances were 50 percent more likely to survive than their counterparts treated with ALS. Not everyone is convinced by the conclusions. "They've done as much as they possibly can with the existing data, but I'm not sure that I'm convinced they have solved all of the selection biases," says Judith R. Lave. "I would say that it should be taken as more of an indication that there may be some very significant problems here."
112 comments | about 2 months ago
An anonymous reader writes Nielsen is going to start studying the streaming behavior of online viewers for the first time. Netflix has never released detailed viewership data, but Nielsen says it has developed a way for its ratings meters to track shows by identifying their audio. From the article: "Soon Nielsen, the standard-bearer for TV ratings, may change that. The TV ratings company revealed to the Wall Street Journal that it's planning to begin tracking viewership of online video services like Netflix and Amazon Prime Instant Video in December by analyzing the audio of shows that are being streamed. The new ratings will come with a lot of caveats — they won't track mobile devices and won't take into account Netflix's large global reach — but they will provide a sense for the first time of which Netflix shows are the most popular. And if the rest of the media world latches onto these new ratings as a standard, Netflix won't be able to ignore them."
55 comments | about 2 months ago
MojoKid writes Last week, NVIDIA offered information regarding its Android Lollipop update for the SHIELD Tablet and also revealed a new game bundle for it. This week, NVIDIA gave members of the press early access to the Lollipop update, which will also be rolling out to the general public later today. Some of the changes are subtle, but others are more significant and definitely give the tablet a different look and feel compared with the original Android KitKat release. Android Lollipop introduces a new "material design" that further flattens the look of the OS. Google seems to have taken a more minimalist approach, as everything from the keyboard to the settings menus has been cleaned up considerably. Many parts of the interface don't have any markings except for the absolute necessities. While the OS definitely feels more fluid and responsive, the default look isn't always better, depending on your personal taste. The app tray, for example, has a plain white background, which looks jarring if you're using a colorful wallpaper. And finding the proper touch points for things like a settings menu or clearing notifications isn't always obvious. Performance-wise, NVIDIA's SHIELD Tablet showed significantly better performance on Lollipop for general compute tasks in benchmarks like Mobile XPRT, but lagged slightly behind KitKat in graphics performance, which could be attributed to driver optimization.
57 comments | about 2 months ago
HughPickens.com writes: Every year the works of thousands of authors enter the public domain, but only a small percentage of these end up being widely available. So how do organizations such as Project Gutenberg choose which works to focus on? Allen Riddell has developed an algorithm that automatically generates an independent ranking of notable authors for any given year. It is then a simple task to pick the works to focus on or to spot notable omissions from the past. Riddell's approach is to look at what kind of public domain content the world has focused on in the past and then use this as a guide to find content that people are likely to focus on in the future.
Riddell's algorithm begins with the Wikipedia entries of all authors in the English language edition (PDF)—more than a million of them. His algorithm extracts information such as the article length, article age, estimated views per day, time elapsed since last revision, and so on. This produces a "public domain ranking" of all the authors that appear on Wikipedia. For example, the author Virginia Woolf has a ranking of 1,081 out of 1,011,304, while the Italian painter Giuseppe Amisani, who died in the same year as Woolf, has a ranking of 580,363. So Riddell's new ranking clearly suggests that organizations like Project Gutenberg should focus more on digitizing Woolf's work than Amisani's. Of the individuals who died in 1965 and whose work will enter the public domain next January in many parts of the world, the new algorithm picks out T.S. Eliot as the most highly ranked individual. Others near the top include Somerset Maugham, Winston Churchill, and Malcolm X.
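A toy Python sketch of this kind of ranking might look like the following; the feature set, the log-scaled weighted sum, and all the numbers are illustrative assumptions rather than Riddell's actual model.

```python
# Sketch: score each author's Wikipedia article on a few prominence signals
# and sort. Features, weights and values below are made up for illustration.
import math

authors = [
    # (name, article_length_chars, article_age_days, views_per_day) -- made-up
    ("Virginia Woolf",    95_000, 4_500, 3_200),
    ("Giuseppe Amisani",   6_000, 2_800,    15),
]

def prominence_score(length, age_days, views_per_day):
    """Combine log-scaled signals into a single score (higher = more notable)."""
    return (0.3 * math.log1p(length)
            + 0.2 * math.log1p(age_days)
            + 0.5 * math.log1p(views_per_day))

ranked = sorted(authors, key=lambda a: prominence_score(*a[1:]), reverse=True)
for rank, (name, *_) in enumerate(ranked, start=1):
    print(rank, name)
```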
55 comments | about 2 months ago
jones_supa writes We are all aware of the various chirping and whining sounds that electronics can produce. Modern graphics cards often suffer from this kind of problem in the form of coil whine. But how widespread is it really? Hardware Canucks put 50 new graphics cards side by side to compare them solely from the perspective of subjective acoustic disturbance. NVIDIA's reference platforms tended to be quite well behaved, as did their board partners' custom designs. The same can't be said about AMD, since the reference R9 290X and R9 290 should be avoided if you're at all concerned about squealing or any other odd noise a GPU can make. However, custom Radeon-branded SKUs should usually be a safe choice. While the amount and intensity of coil whine largely seems to boil down to luck of the draw, at least most board partners are quite accommodating with their return policies concerning it.
111 comments | about 2 months ago
An anonymous reader writes Scientists from Los Alamos National Laboratory have used Wikipedia logs as a data source for forecasting disease spread. The team was able to successfully monitor influenza in the United States, Poland, Japan, and Thailand, dengue fever in Brazil and Thailand, and tuberculosis in China and Thailand. The team was also able to forecast all but one of these, tuberculosis in China, at least 28 days in advance.
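As a rough sketch of how article page-view counts can feed a simple forecast, consider the Python example below; the articles, numbers, 28-day lead, and plain linear model are illustrative assumptions, not the Los Alamos team's method.

```python
# Sketch: fit a simple linear model from Wikipedia page-view counts for
# disease-related articles to official case counts, then apply it to newer
# page-view data as a forecast. All numbers are made up for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression

# Weekly page views for a few flu-related articles (rows = weeks, made-up).
page_views = np.array([
    [12_000, 3_400,   900],
    [15_500, 4_100, 1_200],
    [22_000, 6_800, 2_300],
    [30_000, 9_500, 3_100],
])
# Official case counts reported 4 weeks (28 days) after each row of page views.
cases_4_weeks_later = np.array([1_800, 2_400, 3_900, 5_600])

model = LinearRegression().fit(page_views, cases_4_weeks_later)

# Forecast from the most recent week of page-view data.
latest_views = np.array([[27_000, 8_200, 2_700]])
print(f"Forecast cases in 4 weeks: {model.predict(latest_views)[0]:.0f}")
```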
61 comments | about 2 months ago
Lucas123 writes Backblaze, which has taken to publishing data on hard drive failure rates in its data center, has just released data from a new study of nearly 40,000 spindles revealing what it says are the top 5 SMART (Self-Monitoring, Analysis and Reporting Technology) values that correlate most closely with impending drive failure. The study also revealed that many SMART values one would intuitively consider related to drive failure actually don't relate to it at all. Gleb Budman, CEO of Backblaze, said the problem is that the industry has created vendor-specific values, so a stat relevant to one drive and manufacturer may not apply to another. "SMART 1 might seem correlated to drive failure rates, but actually it's more of an indication that different drive vendors are using it themselves for different things," Budman said. "Seagate wants to track something, but only they know what that is. Western Digital uses SMART for something else — neither will tell you what it is."
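For readers who want to poke at public drive data themselves, a minimal Python sketch of this kind of check might look like the following; the file name is hypothetical, and treating "non-zero raw value" as the warning signal is a simplification of Backblaze's analysis.

```python
# Sketch: for each SMART attribute, compare the failure rate of drives
# reporting a non-zero raw value against drives reporting zero.
# Column names follow the style of Backblaze's published CSV dumps
# (e.g. smart_5_raw, failure); the file name below is an assumption.
import pandas as pd

df = pd.read_csv("drive_stats.csv")  # hypothetical export with a 'failure' column

attributes = ["smart_5_raw", "smart_187_raw", "smart_188_raw",
              "smart_197_raw", "smart_198_raw"]

for attr in attributes:
    flagged = df[df[attr].fillna(0) > 0]   # drives with a non-zero raw value
    clean = df[df[attr].fillna(0) == 0]    # drives reporting zero
    if len(flagged) and len(clean):
        ratio = flagged["failure"].mean() / max(clean["failure"].mean(), 1e-9)
        print(f"{attr}: failure rate {ratio:.1f}x higher when non-zero")
```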
142 comments | about 2 months ago
KentuckyFC writes During the Chinese New Year earlier this year, some 3.6 billion trips were made across China, making it the largest seasonal migration on Earth. These kinds of mass movements have always been hard to study in detail. But the Chinese web services company Baidu has managed it using a mapping app that tracked the location of 200 million smartphone users during the New Year period. The latest analysis of this data shows just how vast this mass migration is. For example, over 2 million people left Guangdong province and returned just a few days later--that's equivalent to the entire population of Chicago upping sticks. The work shows how easy it is to track the movement of large numbers of people with current technology--assuming they are willing to allow their data to be used in this way.
48 comments | about 2 months ago
bmahersciwriter writes Citation is the common way that scientists nod to the important and foundational work that preceded their own, and the number of times a particular paper is cited is often used as a rough measure of its impact. So what are the most highly cited papers in more than a century of scientific research? Is it the determination of DNA's structure? The identification of rapid expansion in the Universe? No. The top 100 most cited papers are actually a motley crew of methods, data resources and software tools that usability, practicality and a little bit of luck have propelled to the top of an enormous corpus of scientific literature.
81 comments | about 3 months ago
An anonymous reader writes Lenovo is the latest tech company to enter the fitness tracker market with its Smartband SW-B100 device. "It can record calories burnt, steps taken and a user's heart rate, in addition to syncing with a smartphone through an app to provide more complete health data. Users can also customize notifications and reminders on the smartband, and even use it to unlock a Windows PC without typing in the password, according to the product page."
51 comments | about 3 months ago
jones_supa writes: Microsoft has just released Windows 10 Technical Preview build 9860. Along with the new release, Microsoft is introducing an interesting cadence option for how quickly you receive new builds. The "ring progression" goes from development, to testing, to release. Choosing the slow cadence gets you more stable builds, but they arrive less often; choosing the fast option lets you receive a build on the same day it is released. As a quick stats update, Microsoft has to date received over 250,000 pieces of feedback through the Windows Feedback tool, 25,381 community forum posts, and 641 suggestions in the Windows Suggestion Box.
112 comments | about 3 months ago