Friday, August 30, 2013

Nieman Journalism Lab

Posted: 30 Aug 2013 09:24 AM PDT



Discussions about the future of news often feature a dystopian, filter-bubbly argument: Let people personalize the news they consume and it'll kill the common conversation essential to democracy.




I've always been suspicious of that argument. (That "common conversation" was always a bit of a charade, and old newspaper monopolies can't be simply willed back into existence by trying to guilt-trip readers who've found options they prefer.)



But whatever your feelings about it, the reality is that very few news organizations have invested in technology that would allow for substantially different presentation of the news from person to person. If my neighbor goes to CNN.com, he'll see the same webpage I do; if my mother goes to LATimes.com, it'll be the same page I see. (Some sites do minor shifts based on geography -- local audiences vs. national audiences, for instance -- but that's still a long way from true personalization.)



The New York Times is one of the few major news organizations that's invested in a recommendation engine that attempts to figure out what stories you, as an individual, might be most interested in. Today, they pushed a set of improvements to that engine, with more to come:



Have you visited the NYT Recommendations page lately? Now powered in real time:



-- Derek Willis (@derekwillis)



The New York Times has improved its personalized recommendation engine. Notice any difference?



-- Patrick LaForge, NYT (@palafo)



So what's changed?



Now real-time (no lag); cross-platform; video and slideshows added.



-- Andrew Phelps (@andrewphelps)



Main difference seems to be real time results with more of your reading used to make recommendations.



-- Patrick LaForge, NYT (@palafo)



And yeah, going further back into your reading history for improved recommendations.



-- Andrew Phelps (@andrewphelps)



They're not kidding about the real-time aspect; my recommendations have changed several times in just the past few minutes. (Although it keeps insisting I need to read one story in particular.)
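The Times hasn't said how the engine works under the hood, but the ingredients being described -- your reading history, weighted toward what you've read most recently, matched against fresh stories -- are easy to sketch. Here's a minimal, hypothetical illustration in Python (emphatically not the Times' actual method); the topic tags, decay constant, and sample stories are all invented for the example.

    import time

    def score_article(article_tags, reading_history, now=None, half_life_hours=24.0):
        """Score a candidate story against a reader's history.

        reading_history: list of (tags, timestamp) pairs for stories already read.
        Recent reads count more than old ones (exponential decay), which is one
        simple way to get the changes-within-minutes behavior described above.
        """
        now = now or time.time()
        score = 0.0
        for tags, read_at in reading_history:
            age_hours = (now - read_at) / 3600.0
            weight = 0.5 ** (age_hours / half_life_hours)  # halves every 24 hours
            score += weight * len(set(article_tags) & set(tags))
        return score

    # A hypothetical reader who just read two baseball stories and, last week, a politics story.
    history = [
        ({"phillies", "baseball"}, time.time() - 600),        # 10 minutes ago
        ({"mets", "baseball"}, time.time() - 3600),            # an hour ago
        ({"congress", "politics"}, time.time() - 7 * 86400),   # a week ago
    ]

    candidates = {
        "Phillies beat Mets in extra innings": {"phillies", "mets", "baseball"},
        "Budget fight looms in Congress": {"congress", "politics"},
        "New ramen spots in Brooklyn": {"food", "nyc"},
    }

    for title, tags in sorted(candidates.items(),
                              key=lambda kv: score_article(kv[1], history),
                              reverse=True):
        print(f"{score_article(tags, history):5.2f}  {title}")

Rebuild the ranking every time the history changes and you get the "real-time" feel; extend the history window and you get the "going further back" improvement. The real system presumably does far more (collaborative filtering, popularity signals, business rules), but the shape of the problem is roughly this.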



I've always found the Times' recommendations engine to be quite good, and the early reviews of this edition are solid:



Mine is spot on. RT: The New York Times has improved its personalized recommendation engine. Notice any difference?

The new recommendation engine suggests a Phillies story as #4. It's the best friend I've ever had



-- Daniel Victor (@bydanielvictor)



The Times is well positioned for an investment in personalization. First off, it produces an enormous amount of content -- hundreds of stories, blog posts, videos, and slideshows each day. If you're limiting your recommendations to recent content, you need to be producing a lot of it every day for a personalization filter to make any sense.



Second, the Times' metered paywall approach -- 10 free articles a month, payment required after that -- means that getting another click can mean a lot more than earning another $0.004 in advertising revenue. The meter incentivizes the paper to do whatever it can to push a marginal reader's story count higher. If you can figure out what a reader wants -- and you can make those recommendations prominent, as the Times does by putting them in the sidebar of article pages -- maybe you can turn a few 8-article-a-month (free) types into 15-article-a-month (paying) types. (In the preview of the coming site redesign, recommendations aren't nearly as prominent -- but that could obviously change before the redesign launches.)



My big question is when recommendations will break out of their shell and become more prominent on the front page of NYTimes.com. They're there now, but several screenfuls down, beneath dozens of other links. When the layout and selection of stories at the top of the front page starts to be influenced by personal recommendations -- when, say, 2 of the 15 articles on the top of my version of NYTimes.com are different from your version -- that'll be a milestone for the algorithm. (Or the death knell of democracy, if you argue the other side.) There's a lot of newspaper tradition arguing against that sort of personalization; I'm hoping the Times can allow itself to benefit a bit more from a pretty powerful tool.



Also, Jacob Harris has an interesting idea:



I also would love if the Recommendation Engine would let me look at what my complete opposite would read.

UPDATE: This isn't strictly related to the recommendations engine, but I also noticed that The New York Times is now on IFTTT, the web-automation tool popular among a certain subspecies of nerds. So you can, for example, now get an automatic email when a New York Times Magazine story becomes very popular, or save all Well stories to Instapaper, or get an SMS when your company is mentioned in a Times story.
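IFTTT itself is point-and-click, but the kind of recipe described above is simple enough to approximate in a few lines, which is a useful way to see what the service is actually doing for you. Here's a rough sketch of the "tell me when my company shows up in a Times story" idea, polling a public RSS feed; the feed URL and keyword are placeholders, and the notification step is stubbed out rather than wired to real email or SMS.

    import time
    import urllib.request
    import xml.etree.ElementTree as ET

    FEED_URL = "https://rss.nytimes.com/services/xml/rss/nyt/HomePage.xml"  # placeholder feed
    KEYWORD = "Acme Corp"  # hypothetical company to watch for

    def fetch_items(url):
        """Return (title, link) pairs from an RSS feed."""
        with urllib.request.urlopen(url, timeout=10) as resp:
            root = ET.fromstring(resp.read())
        return [(item.findtext("title", ""), item.findtext("link", ""))
                for item in root.iter("item")]

    def notify(title, link):
        # Stand-in for IFTTT's action step (send an email, an SMS, save to Instapaper, etc.).
        print(f"ALERT: {title}\n  {link}")

    seen = set()
    while True:
        for title, link in fetch_items(FEED_URL):
            if KEYWORD.lower() in title.lower() and link not in seen:
                seen.add(link)
                notify(title, link)
        time.sleep(15 * 60)  # poll every 15 minutes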



Will IFTTT generate earthshaking pageviews for the Times? Highly doubtful. But it's another case, like the recommendations engine, of the Times (a) differentiating itself from its peers and (b) using technology to make the Times more useful to its readers. Those are wins.



Posted: 30 Aug 2013 09:00 AM PDT



EDITOR'S NOTE: There's a lot of interesting academic research going on in digital media -- but who has time to sift through all those journals and papers?



Our friends at Journalist's Resource, that's who. JR is a project of the Shorenstein Center on the Press, Politics and Public Policy at the Harvard Kennedy School, and they spend their time examining the new academic literature in media, social science, and other fields, summarizing the high points and giving you a point of entry. Roughly once a month, JR managing editor John Wihbey will sum up for us what's new and fresh.



It's back-to-school time, and recently it seems "school" is coming ever closer to the media. The news that the leading political science blog The Monkey Cage will become part of the Washington Post is the latest sign that academia may play a bigger role in coverage of public affairs.



One hybrid research-communications outfit -- an aggregator of university public information articles -- continues to build its audience, while startup media platforms for academic voices are pioneering a new template. Of course, more researchers are joining Twitter every month. A new paper in Journalism Studies, from scholars at CUNY's Baruch College, looks at the possibility of more direct participation by higher education institutions in the creation of investigative and accountability reporting. And after Labor Day, journalism schools across the country will be churning out greater volumes of community news than ever before, as the Knight Foundation, Poynter, and many others push change and reform in this general direction.



This latest roundup of digital scholarship highlights a growing number of studies -- most brand-new, but a few from earlier this summer -- that have strong implications for the news business and the practice of journalism. Papers in this research area can be highly theoretical, so it's good to see some concrete takeaways offered from academia for a struggling industry. In the coming months, we'll be tracking the most powerful and useful studies that help bridge this gap.



Study from San Diego State University, published in Journalism & Mass Communication Quarterly. By Amy Schmitz Weiss.



Are news media missing business and growth opportunities by not offering and utilizing more geolocation functionality in their mobile apps? Weiss analyzes more than 100 native apps from top TV network affiliates and radio stations, as well as other news apps in Apple's App Store. She combines that content analysis with results from an online survey of young news consumers, who are increasingly likely to employ geolocation "check-ins" and location-based services as part of their mobile experience.



She finds that the "adoption of geo-located news stories is nonexistent among the traditional media examined. Six apps that did offer geo-located news were mainly user-generated apps." The verdict on news organizations is damning, and the implications are clear: "Legacy news organizations analyzed in this study show that they are failing to keep up with the demand based on what news consumers, particularly young adults, are doing and using on their smartphones. This is supported by the proven hypothesis in this study that found younger adults who use location based services are also likely to consume news on their smartphone."



Study from NYU Stern School of Business and Harvard Business School, forthcoming in Management Science. By Robert Seamans and Feng Zhu.



It has become conventional wisdom in journalism circles that the loss of the classified ads -- a key part of the newspaper "bundle" that became tragically unbundled as the Internet rose -- was a devastating blow to local news. How devastating? Seamans and Zhu estimate that Craigslist alone cost the business $5 billion over the period 2000-2007.



Of course, setting aside media hand-wringing, this might also be seen purely as good market efficiency -- as $5 billion in net "savings" for classified ad buyers. In any case, the data suggest that "relative to newspapers without classified ad managers, the effect of Craigslist's entry on newspapers with classified ad managers leads to a decrease of 20.7% in classified-ad rates, an increase of 3.3% in subscription prices, a decrease of 4.4% in circulation and a decrease of 3.1% in display-ad rates." (For some contextual perspective, see Pew's figures, which show that total newspaper classified revenue was about $20 billion annually in 2000, around $13 billion in 2007, and today about $4 billion.)



Seamans and Zhu conclude that the findings help "build an understanding of how media platforms respond to shocks from technologically disruptive entrants from different industries. This issue is important because the boundaries between media industries are blurred today, as advertisers can reach relevant consumers through a variety of channels such as TV, the Internet, and mobile devices. Therefore, platforms are likely to be unprepared for competition if they rely on industry boundaries to identify their competitors."



Paper from the University of Minnesota and George Washington University, published in Media, Culture & Society. By Seth C. Lewis and Nikki Usher.



The paper explores and critiques the so-called "hacks and hackers" movement -- the hybrid work being done by journalists and technologists. Lewis and Usher (both frequent Nieman Lab contributors) make a series of observations about different facets of this collaboration, and they review the relevant pre-history. But they focus it all around a note of worry: "Because the focus has been on solving problems for journalism, we feel that less attention has been paid to how the larger culture of open-source software production might inform journalism's broader innovation."



One principle they offer up and explore is "news story as code" -- the notion that news might be endlessly annotated and reshaped by the community. Another idea reviewed is "journalism as knowledge management" -- the journalist as curator of community contributions. Lewis and Usher assert that all of the collaborations and actors involved to date deserve scrutiny, given the numerous inherent challenges, such as the failure to attract true community participation, the reality that projects need leaders, and so on: "[T]hese problems with open source also point to the need to question its aggressive promotion by the likes of Knight, Mozilla, Google, and other institutions seeking to shape the future of journalism and technology. Issues of power, ideology, and control ought to be part of future studies of this emerging connection between the journalism field, tech communities, and open source."



The authors conclude: "[W]e should be careful not to fetishize this concept, or any other, as a panacea, particularly at a time when the latest technology invention is too readily seen as the salvation for journalism's troubled model in the 21st century."



Study from CUNY, published in Journalism. By C.W. Anderson.



A paper that blends high theory with empirical, ethnographic research performed in newsrooms and with practitioners of both legacy media and blogs, it attempts to understand the increasingly blurry difference between "original" and aggregated-derivative journalistic work. What actually is "news" these days?



C.W. Anderson (a regular Nieman Lab contributor) takes as his starting point ideas from the FCC workshop "The Future of Media and Information Needs of Communities: Serving the Public Interest in the Digital Era." The views of Steve Waldman, Jeff Jarvis, Paul Starr, and other journo-thinker luminaries are all summarized. Ultimately, Anderson tries to reconcile the competing notions and big arguments about news that emerge. If we can't agree about theory, how about practice? He takes to the streets to do his own journalism and aggregation of the journalists and aggregators. No doubt, the research fieldwork was interesting -- "Sitting in a darkened midtown bar that has long been one of the favorite haunts of journalists working for the New York City tabloid New York Post" -- and the author sails off among the likes of Huffington Post (12 aggregation sites in all) and select mainstream media institutions.



Based on these interviews and field observations, conducted between 2008 and 2011, Anderson notes: "If we remain on the level of rhetorical conflict, of course, the lines are clear enough, but the minute we descend into the realm of material practice all manner of complications ensue ... [I]t is hard to say which 'occupational group' engages in which jurisdictional practices, given the evidence that aggregation is a radically hybrid form of newswork that promiscuously crosses occupational boundaries."



And Anderson ties it all together with a striking, and very useful, set of fundamental distinctions: "[A]ggregators have accepted the website and the link, and categories of digital evidence more broadly, as valid items which can be rationally processed through the news network. Journalists, on the other hand, remain wedded to analog evidence -- quotes, official government sources, first-person observations, analog documents and files -- as the primary raw material out of which they build their stories. In part this relates to material practice, but it also relates to journalistic culture. In terms of expertise and authority, it means that aggregators and reporters have, thus far, built themselves distinct news networks, with different black-boxed objects of evidence and different claims about how material interaction with those objects validates their professional authority."



Study from Ben-Gurion University of the Negev, published in Journalism. By Zvi Reich.

Tracking the technologies reporters actually rely on over a decade of news work, Reich finds that "social networks contributed 0.4% of the information in news gathering and Internet use did not exceed 4% throughout the decade. The dominant and most remarkably stable technology (even displaying a slight rise in the news discovery phase) is the telephone."



Reich notes that the findings are consistent with similar studies about journalistic practice in the U.S. and European contexts. (See, for example, a related 2012 study from the University of Georgia.) The author states that "reporters tend to conservatism, even when expected to display maximal receptivity to innovation -- an observation that may disappoint scholars who correctly envisioned the vast potential of new technologies to release journalists from their restricted role as sources' 'oral relays' or help them adapt to changes in news environments."



Working paper from the University of Toronto and Michigan State. By Jonathan A. Obar and Andrew Clement.



As the debate rages over NSA surveillance of Internet traffic routed through the United States -- and as Silicon Valley worries over its business implications -- this paper underscores the dilemmas even for friends and allies. The authors note that a good deal of Canadian Internet traffic passes through U.S. switching centers or carriers, a phenomenon they call "boomerang routing." In their dataset sample, about one-quarter of the 25,000 traceroutes appear to boomerang through the United States. They assert that this means the traffic is subject to all kinds of U.S. security surveillance laws, in effect negating Canada's sovereignty over its citizens' communications and canceling its ability to control the legal norms to which the Canadian citizenry is subjected. Obar and Clement advocate more investment in public Internet infrastructure, particularly the building of more exchange points, and make a call for greater north-of-the-border digital sovereignty. A related paper, also authored by Clement at the University of Toronto, is worth checking out.



Study from City University London, published in Digital Journalism. By Neil Thurman.



Many newspapers now boast of having much larger total audiences, as new web visitors more than make up for losses in print subscribers. But the numbers about audience "reach" can deceive, Thurman finds, if you take careful stock of the actual amount of time people are spending with news institutions. He analyzes various circulation and Nielsen figures relating to 12 British national newspapers; the analysis breaks down the domestic and overseas audiences. Thurman finds that, as of 2011, a "minimum of 96.7 percent of the time spent with newspaper brands by their domestic audience was in print."



Because online visitors typically stay only briefly, even large numbers of site visitors can produce relatively little overall time with the news product. Even The Guardian was seeing only 7 percent of total time spent with its product coming from online sources. Further, news appears to be losing out in terms of public attention: "Looking at the results between 2007 and 2011 what is perhaps most concerning for newspaper brands is that for all but one of the titles studied -- The Guardian -- the total number of minutes spent reading by the aggregated UK print and online readerships has declined. Across the 12 titles the average fall was at least 16.05 percent."
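The arithmetic behind that finding is worth making concrete: a very large online audience that spends a few minutes a month still adds up to far less reading time than a much smaller print audience that spends many minutes a day. A back-of-the-envelope illustration with invented numbers (not Thurman's data):

    # Hypothetical monthly figures for one newspaper brand -- invented for illustration.
    print_readers = 1_000_000            # daily print readers
    print_minutes_per_day = 40           # minutes per print reader per day
    online_visitors = 30_000_000         # monthly unique online visitors
    online_minutes_per_month = 1.5       # minutes per online visitor per month

    print_minutes = print_readers * print_minutes_per_day * 30    # minutes per month
    online_minutes = online_visitors * online_minutes_per_month

    total = print_minutes + online_minutes
    print(f"Print share of time spent:  {print_minutes / total:.1%}")
    print(f"Online share of time spent: {online_minutes / total:.1%}")
    # With numbers like these, an audience 30x the size contributes only ~4% of total reading time.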



Still, overseas web visitors were contributing substantially to the overall online time spent with the news entity: For every hour spent by domestic web visitors, 25 more minutes were added by the audience abroad -- at least among the five selected publications. The author concedes that the data he analyzes is imperfect (he can't account for mobile apps, for example), but the takeaway is as follows: "Although some newspapers might take comfort from their increased popularity, because the online visitors who are driving that increase are being relatively frugal with the time they spend with newspapers' online channels, losses in the time-spent-reading newspapers' print products have not, with the exception of The Guardian, been offset by gains in online time-spent-reading."



Paper from the University of North Carolina, Chapel Hill, published in American Behavioral Scientist. By Zeynep Tufekci.



Tufekci, also an engaged academic commentator, has previously studied activism dynamics. Her latest study presents case studies in how political activists/citizen journalists -- so-called "microcelebrity activists," which she notes "first came to the forefront of international attention in the Arab Spring" -- gain and maintain audiences in social media space, and how this helps define emerging notions about media attention and political power.



She examines in great detail the case of a Bahraini activist and finds important differences between the former media world, when legacy media predominated, and our present moment: "Perhaps the most important difference that flows from these cases is that the 'power-dependency' relationship between media and the social movement actors has been fundamentally altered," Tufekci writes. "The microcelebrity activist is not monopolistically dependent on mass media for attention of broader publics. In fact, some activists have follower networks that rival readership of large newspapers. Furthermore, since the immediate follower network also acts as propagator, the reach of these activists can easily be tens of millions of people in just one or two degrees out of their core social media networks -- and, of course, this kind of reach often also supports mass media appearances, further increasing visibility."



Study from DePaul University, published in Journalism & Mass Communication Quarterly. By Matthew W. Ragas and Hai Tran.



The paper takes an empirical look at the evolving two-way street of how news coverage can drive online search -- and how online search can also drive media coverage. Ragas and Tran use the Associated Press and Reuters as their representative indicators of news coverage and analyze data from the U.S. Search Intelligence database of Experian Hitwise. The study looks at coverage of President Obama during 2009-2010. Predictably, more AP and Reuters coverage -- particularly negative coverage -- was associated with more online search around Obama. But, interestingly, Ragas and Tran found that "coverage volume was also influenced by search trends, demonstrating an instance of reverse agenda setting with the media seemingly monitoring and taking cues from Internet users. Moreover, the impact of search salience on media salience occurred relatively quickly (starting within a week), while the media-led influence appeared to accumulate over a five-week span."



The findings validate greater media investment in monitoring of the digital space -- they "lend empirical support to recent observations of journalists monitoring, influencing, and reacting to search trends and the rise of the active audience in web environments." For communications and journalism scholars, the study is particularly interesting because it shows that the traditional dynamic of media "agenda setting" -- telling the public what to think about, and how to think about it -- is changing and becoming a more fluid, two-way process.



Study from the University of Texas at Austin and Pontificia Universidad Catolica de Chile, published in Convergence: The International Journal of Research into New Media Technologies. By Ingrid Bachmann and Homero Gil de Zuniga.



The study contributes to the growing literature on how news consumption habits, particularly of digital media, may contribute to engagement and participation levels in other parts of democratic life. In general, people who consume news tend to participate in civic affairs more than those citizens who don't pay much attention. That much should be obvious, and there's some data to back it up.



But this study -- which analyzes 2008-2009 online survey data from 945 participants -- finds that those who prefer digital media are, on average, more civically active (e.g., volunteering or charity work, attending a public hearing or rally, etc.) than traditional print news consumers: "And this is the case regardless of whether it refers to online or offline means of participation and beyond the effect of demographic factors, social orientations and people's levels of news consumption. These results seem to indicate that the Internet may supply a set of characteristics that print journalism may be unable to provide."



Interesting, but it's certainly not the last word on the subject, as there is also a competing academic research thread suggesting that more digital media choice may actually contribute to participatory inequality. (See, for example, the work of Princeton's Markus Prior.)



Study from the University of Washington, Seattle Children's Research Institute, and University of Wisconsin-Madison, published in Computers in Human Behavior. By Megan A. Moreno, Jonathan D'Angelo, Lauren E. Kacvinsky, Bradley Kerr, Chong Zhang and Jens Eickhoff.



A profitable back-to-school read for incoming college freshmen. Despite the laughs the topic might generate -- "Awesome selfie with Natty Ice!" -- the researchers are dead serious about the public health implications, and they focus on the consequences of a widespread problem: the escalation of drinking among late teens who find themselves suddenly free from the shackles of Mom and Dad.



The researchers conduct a comprehensive survey of 338 young persons from two different universities and determine that "over the first year of college, alcohol displays on Facebook dramatically increased in a variety of multimedia formats." There were significant differences over time between students at the two universities studied, suggesting that it is college-specific norms that "may impact both alcohol behaviors as well as what material is socially acceptable to display on Facebook at that school."



The researchers propose that "it is worth considering whether universities should play a role in discouraging displayed alcohol content on Facebook by their students. Students may underestimate the potential implications for employment or future educational opportunities that could be impacted by displayed alcohol content on Facebook."



Photo used under a Creative Commons license.



Posted: 30 Aug 2013 08:00 AM PDT



WORK ON THE SNOWDEN DOCUMENTS SPREADS: A couple of smaller new stories trickled out this week regarding U.S. NSA surveillance, led by reporting on the U.S.' secret intelligence budget and the revelation that the NSA paid tech companies to cover their costs in complying with its PRISM online data-gathering program. One of those companies has been anonymously fighting a government gag order about a surveillance case, though it was inadvertently identified as Google. A story on the Australian news site The Stringer accused Google of being essentially an arm of the U.S. State Department.



Elsewhere, the British newspaper The Independent published a story about a secret U.K. surveillance base in the Middle East, claiming that it was in Edward Snowden's leaked documents. Snowden and The Guardian's Glenn Greenwald denied that Snowden was The Independent's source, with Greenwald suggesting that the leak actually came from the U.K. government itself (something The Independent denied).



The Guardian also revealed that it's brought in The New York Times to work with it on the Snowden documents after the U.K. government made the paper destroy its copies of the documents in that country. BuzzFeed's Ben Smith reported that The Times is working on a set of stories to co-publish with The Guardian next month. ProPublica also said that it's working on the documents, first with The Guardian and most recently with The Times. The Guardian's editors discussed their work with the documents, and The Times' public editor, Margaret Sullivan, weighed in on its reporting on the story.



Critics continued to target the U.S. and U.K. governments for their actions toward the journalists reporting on the leaks. Free Press' Josh Stearns called for journalists to fight back, and the British journalism review Press Gazette weighed in as well. Likewise, The Times' David Carr took journalists to task for turning against Greenwald as a fellow journalist, a point others echoed. NYU's Jay Rosen argued that the surveillance state is trying to "throw sand in the gears" of journalism, and that "journalism almost has to be brought closer to activism to stand a chance of prevailing in its current struggle with the state."



JOURNALISTIC INTEGRITY QUESTIONS AT ESPN: A troubling development late last week in the world of sports journalism prompted some further questions about the relationship between journalists and their sources -- not involving the state this time, but the sports leagues they cover. PBS' Frontline announced that ESPN was taking its name off of a documentary on the NFL's response to head injuries that the two groups had been collaborating on. The initial suspicion was that ESPN backed out to avoid incurring the wrath of its most lucrative broadcast partner (it has a $15.2 billion deal to broadcast the NFL's games), and that appeared to be confirmed by a New York Times report that the NFL pressured ESPN after seeing the doc's trailer.



ESPN denied that its decision was influenced by the NFL, saying it pulled out because it didn't have any editorial control over the program. The New Republic's Marc Tracy argued (before the Times piece was published) that the network was probably more over-careful than anything. ESPN ombudsman Robert Lipsyte couldn't say exactly what happened, but said it was clumsiness at best and a more sinister elevation of profit motive over journalism at worst.



Sports Illustrated's Richard Deitsch wrote that the episode will continue to test the tenuous relationship between the NFL, ESPN's business side, and its journalism operations. The Nation's Dave Zirin went further, anonymously quoting several veteran ESPN journalists who lamented the extreme power imbalance in favor of business interests over journalism at the network. "It is not a journalism company. It's an entertainment company," said one. "When you get in bed with the devil, sooner or later you start growing your own horns."



ESPN received a torrent of criticism externally as well. The NFL players' union described it as a "disappointing day for journalism," and Sports Business Journal's John Ourand said it was a "huge black eye for ESPN." Writers at Awful Announcing and elsewhere talked about what an ominous indicator this was of ESPN's journalistic integrity. Poynter's Kelly McBride said the episode showed that "it's incredibly difficult for a news organization to hold its own partners accountable."



Deadspin's Barry Petchesky rounded up past examples of ESPN bowing to leagues' pressure, and The New York Times published an in-depth report on ESPN's dominance of the world of college athletics that even led to a mid-2000s Justice Department investigation. The Columbia Journalism Review's Dean Starkman explained why ESPN's conflicts between business and journalism are even more fundamental than in the rest of the journalism world.



ANOTHER NEWS ORG'S SITE HACKED: The New York Times' website went down for several hours on Tuesday because of a hack by the Syrian Electronic Army, a group sympathetic to the Bashar al-Assad regime in Syria. It was the second time The Times' site had gone down in the past month, and at least the third time a news organization's site had been brought down by the Syrian Electronic Army in the past few months. Several outlets published good summaries of the attack.



Poynter's Andrew Beaujon has a roundup of pieces explaining the attack, which hijacked The Times' entries in the Domain Name System and changed the records that allow web browsers to determine what server www.nytimes.com refers to. (It didn't involve any access to The Times' servers.) The Washington Post and CloudFlare had the best explanations of what happened, and the Los Angeles Times reported that the attack started with a phishing email at the paper's domain registrar.
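For the curious, the lookup the attackers subverted is easy to see in a few lines of Python: a browser asks DNS which server sits behind www.nytimes.com, and whoever controls that record controls where visitors end up. A minimal sketch using only the standard library (what it prints depends entirely on your resolver at the moment you run it):

    import socket

    # Ask the system resolver where "www.nytimes.com" currently points.
    # In the attack, the records at the domain registrar were changed, so resolvers
    # handed browsers an attacker-controlled address -- the Times' own servers were
    # never touched.
    for hostname in ("www.nytimes.com", "nytimes.com"):
        try:
            infos = socket.getaddrinfo(hostname, 80, proto=socket.IPPROTO_TCP)
            addresses = sorted({info[4][0] for info in infos})
            print(f"{hostname} -> {', '.join(addresses)}")
        except socket.gaierror as err:
            print(f"{hostname} -> lookup failed: {err}")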



READING ROUNDUP: A few other stories floating around this week:



-- The initial ratings for Al Jazeera America's launch last week were predictably low (given its newness and its difficulty getting carriage on cable providers). Public radio program Here & Now talked about the switch to AJAM, and at the Columbia Journalism Review, j-school dean Lawrence Pintak gave a review of the new network, calling it solid if unspectacular. Quartz's Todd Woody also took a look at the new network, and Doc Searls lamented the loss of Al Jazeera's live online stream in the U.S. and cable companies' aversion to the web.



-- Directors of the drone journalism programs at two j-schools, the universities of Missouri and Nebraska, reported that they've been grounded by the FAA. As The Chronicle of Higher Education reported, the two programs will apply for federal permission to test the drones outdoors, but the decision throws the development of this new journalistic subfield into question. Nebraska's Matt Waite also wrote about the situation at his program's blog.



-- Medium, the invite-only publishing platform co-founded by a couple of the founders of Twitter, drew some discussion about its growth and identity over the past week. Bloomberg Businessweek profiled co-founder Evan Williams, and The Atlantic's Alexis Madrigal explored what exactly Medium is, as did GigaOM. Medium user Anil Dash weighed in as well.



-- Reports surfaced that the tech site All Things D is looking around for a potential buyer as its contract with News Corp subsidiary Dow Jones expires at the end of the year. GigaOM's Mathew Ingram looked at the site's options, and Reuters' Felix Salmon examined the value Kara Swisher and Walt Mossberg have built up at All Things D.



-- Finally, The Guardian's Stijn Debrouwere shared a talk he gave regarding news organizations' misuse of data analytics and how they can make analytics more effective. It's worth a read this weekend.



Photo of football concussion study in progress, used under a Creative Commons license.



Posted: 30 Aug 2013 07:00 AM PDT



The journalism unicorn exists. I've seen one -- even worked with one. Maybe you know the kind: a journalist who's as nimble and dynamic a reporter as she is a coder.



Yes, journalism unicorns are out there, but they are rare -- so rare that Columbia University has had some trouble attracting qualified candidates to a program that offers dual degrees in journalism and computer science.



"There is something about not just being able to think and act like a programmer but also to be able to think and act like a journalist, which is quite demanding," said , director of the Tow Center. "It's an unusual skill set. Newsrooms are crying out for these skills."



So, taking cues from the kinds of programs that help college grads complete prerequisites like organic chemistry before applying to medical school, Columbia is now launching what may be the first program of its kind in the world: a post-baccalaureate program designed to teach students computer science concepts in the context of journalistic practice -- before they enroll in further graduate training. (What is it these days with journalism education looking to medical school for inspiration?)



"Rather than just saying, 'Well, you can take some undergraduate classes and come back to us,' we thought we would introduce a program," said Mark Hansen, director of the and one of the people spearheading the program. The two-semester program, dubbed , is set to welcome its first class in the summer of 2014. (Details like tuition and class size are still being worked out, and applications aren't yet being accepted.)



The program will teach students data science, engineering, statistical concepts, and programming. Those who complete Year Zero aren't guaranteed admission to the Tow Center's dual degree, which may be a good thing. It means that journalists and other professionals at all stages in their careers might be able to benefit from Year Zero. The program isn't limited to journalists or aspiring journalists, either.



"What we found is that journalism is not the only field that's experiencing this [need for computer science expertise]," Hansen told me. "We thought about the paths people might take -- one path is to go onto the dual degree, another path is to go onto another quantitative program, or digital humanities or computational social sciences."



Hansen, whose background is in statistics, says Columbia hired him specifically because the university wants to improve its students' "data-slash-computing-slash-algorithmic acumen." Doing so will theoretically help Columbia meet three of its key journalism goals: remaining one of the best respected journalism programs in the country, serving as a pipeline to the industry, and leading the way among other schools teaching journalism.



Hansen says the need for Year Zero says as much about how journalism is changing as it does about even larger social changes. Where journalists traditionally have used data to answer questions, computational literacy means being able to use data to ask them.



"Data and data technologies are fundamentally changing systems of power in our world," he said. "And journalists need to think clearly about what that all means I think looking at data and thinking about data is becoming, in effect, an act of citizenship."



Photo of "Unicorn Chasers" by used under a Creative Commons license.


