
News stories glue portfolio managers and analysts to their screens. Each story feeds into a positive or negative bias. What if I could automate that? What if I could read everything as it comes out and sort it according to positive and negative news for a company? I could react systematically, quickly and across more stocks. I might even replace the role of the analyst.

Machine-readable news starts with exactly this economic intuition.

Disaffection with existing quantitative trading signals has brought it to investors' attention. Advances in linguistic processing and the steady decline in computing costs have made it better and cheaper than before. And a raft of academic and industry papers has guided the way.
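The intuition above can be illustrated with a deliberately naive, lexicon-based scorer. This is only a sketch: the word lists and headlines below are invented for illustration, and real machine-readable news systems rely on far richer linguistic models.

```python
# Toy lexicon-based news scoring. Word lists and headlines are invented.
POSITIVE = {"beat", "beats", "upgrade", "growth", "record", "surge"}
NEGATIVE = {"miss", "downgrade", "loss", "lawsuit", "recall"}

def score(headline: str) -> int:
    """Return +1 per positive word, -1 per negative word."""
    words = headline.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

headlines = [
    "Acme Corp beats estimates, raises growth outlook",
    "Regulator opens lawsuit over Acme product recall",
]
# Sort stories from most negative to most positive, as an analyst might triage.
for h in sorted(headlines, key=score):
    print(score(h), h)
```

Systematic, fast, and scalable across many stocks, exactly as the economic intuition demands; whether a word-counting scheme captures what an analyst's judgment does is, of course, the hard part.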

London Review of Books and a startlingly nice piece on the intersection of news, newspapers, paper news, and technology.

And a quote from the editors at The New Republic: observations on aggregation.

There isn’t anything inherently wrong with aggregation. On the contrary—unless we expect readers to get all their news from one publication or, alternatively, spend all day sifting through numerous websites themselves—the Web needs aggregators. And smart aggregation does, in fact, add something to the world by bringing a certain editorial judgment to bear on the selection of pieces.

The Editors, The New Republic, behind a paywall: TNR

From our first day in business, Bloomberg was making news, with numbers
Mike Bloomberg and a modest ambition

While most news organizations today are listing in the high seas of the digital world, Bloomberg News has proven to be an adventurous and successful competitor. They started with a key asset, the Bloomberg terminal, and a gaping niche – business journalism. As it has grown, it’s become an instrument of recognition for the entire Bloomberg enterprise, a sales tool, and a critical hedge against competition.

Bloomberg seems to have demonstrated that it’s possible to make money from reporting the news. It’s a fierce competitor to the Wall Street Journal, Reuters and other business news outlets. It runs a thriving business. But Bloomberg isn’t interested in selling news feeds. Indeed, much of its news is given away for free on the web portal. Bloomberg gives it away because it wants to eliminate the profit margin in delivering the news, so it can starve competitors and enhance the value of the terminal. It wants to make the news a commodity.

Bloomberg’s entry into journalism would push traditional news sources to improve their coverage and respond to Bloomberg News. The underlying dataset in the Bloomberg platform gave them a distinct informational advantage over the competition. The information and analytics on financial instruments were just not widely available and not something on which traditional news sources had focused. As Bloomberg says, they were already in the news business – just with numbers. The terminal had become, for example, the de facto source of pricing for US Treasuries: the AP replaced the Federal Reserve’s daily pricing sheet with a Bloomberg terminal in its offices. Each day, when the AP published the closing Treasury prices, sourced and attributed to Bloomberg, they were effectively running a news story, or an advertisement – take your pick. This unique resource separated them from the competition, gave them pricing power and promoted the terminal – all in one stroke.

Business journalism at the time also lacked the luster of reporting on riots, elections, and wars. Journalism schools didn’t teach business and finance reporting. The mainstream, national press would gloss over financial markets on the way toward bigger stories. As Bloomberg remarks, “Even at the Wall Street Journal, it was rare to find top editors who included among their accomplishments daily stints covering stocks and bonds.” Bloomberg News would enter a seemingly uncontested field. In 1988, Bloomberg recruited Matt Winkler to enter the fray.

Bloomberg News also provided a much-needed hedge against the possibility of losing key news suppliers, such as Dow Jones. Bloomberg had already eaten into the Dow Jones Telerate business. While Telerate presented static images of Treasury prices, Bloomberg users were presented with live data on which they could run analytics. When Dow Jones did respond, they pulled the plug on their feeds to Bloomberg, expecting that Bloomberg customers would come back to Telerate and abandon the Bloomberg platform. It turned out that clients found Bloomberg News sufficient: at worst, good enough to get the job done and, at best, invaluable in combination with the underlying dataset. Dow Jones eventually relented six months later and resumed delivering their feeds through the Bloomberg platform. Telerate would later be shut down.

The rapidly growing news enterprise advanced and protected the Bloomberg franchise. It spread the reputation and influence of the Bloomberg organization, and this sold more Bloombergs. More Bloombergs funded more news, and Bloomberg news became increasingly visible beyond the terminal. It worked its way into radio and television first. Then it began traditional print syndication, and syndication brought Bloomberg’s business reporting to the New York Times, among others. With these outlets, the Bloomberg brand became more prominent, more potent. It sold more Bloombergs.

The news division at Bloomberg was never designed to sell the news. It was designed to sell Bloombergs. It started with a market niche and a key asset – business reporting and the terminal. But it rapidly evolved into an important hedge against the risk of key suppliers, such as Dow Jones, cutting off Bloomberg as a customer. When Dow Jones dared to do so, Bloomberg had won. Bloomberg news was good enough to be a substitute or an improvement on most serious business and financial reporting from Reuters, the Wall Street Journal, the FT, the New York Times, and anyone else who might have contact with their customers. Business and financial news reporting, at first an area of distinction for Bloomberg, had become a commodity.

Because Bloomberg doesn’t need to sell the news, those that do are at a disadvantage. They rely on profit margins from distribution, sales and subscriptions to the news. Bloomberg doesn’t. Bloomberg makes money through subscriptions, but they’re subscriptions to the terminal. The news is just another commodity, and it suits Bloomberg just fine to see it have commodity-margins. It just makes the terminal more valuable.

The facts do not owe their origin to an act of authorship.
Justice Sandra Day O’Connor (Feist v Rural Telephone, 1991)

But do the hunt, the research, the interviews? Or perhaps the organization of these into a story for dissemination to a reading public? And can these be made exclusive? These questions have bubbled up as the newspaper industry wrestles with what the internet is doing to its business.

The Cleveland Plain Dealer’s Connie Schultz has argued fervently for the rights of authors and their newspapers to capitalize on their product. She came out against “the aggregators” as though they were a malfeasant band of marauders bent on destroying the institution of journalism and by extension democracy. Citing Daniel and David Marburger, she claimed, “parasitic aggregators reprint or rewrite newspaper stories, making the originator redundant and drawing ad revenue away from newspapers at rates the publishers can’t match.”

James Moroney, publisher and CEO of the Dallas Morning News, takes a similar approach. He invokes the ‘hot news’ doctrine and asks Congress to apply it to the internet. Says Moroney, “perhaps it is time for congress to establish a principle of ‘consent for content’ for breaking news–similar to the ‘hot news’ doctrine recognized by a few states.”

Copyright law sufficed to protect the written word, fixed in a medium, but these claims demand remedy for a larger issue. They aim to protect the investment required to collect the facts and write a story, when it might easily be re-written and distributed by another. But they ask for monopoly control of the story itself — indeed, ownership of the collection of facts and ideas that might make up a breaking story on government corruption, for example. Justice O’Connor, however, finishes with little support for these views: “The distinction is between one of creation and one of discovery.” And discovery is not subject to property rights.

The viewpoints of Moroney, Schultz and the Marburgers have their origin in the nature of print. Print leads to a confusion between controlling the medium and controlling the content – that is, the mistaken idea that breaking a story equates to owning it. The Supreme Court compounded the confusion in 1918 with its decision to augment copyright protection with “quasi-property rights” for the facts and events that make up a news story — the hot news doctrine. It was a legal solution for the disruptive impact of a new technology: newswires. News was paper, and these rights formalized the metaphor. They derived from the physical qualities of the paper, attached property rights to the news and would provide a legal basis from which to make, in this case, the AP’s news exclusive. Theoretically, the AP could then exclude people from learning of it or reprinting it without permission. They wouldn’t just report the news, they would own the news.

Doc Searls frames danah boyd’s recent talk on privacy at SXSW as a loss of control. The internet’s applications and engagement with society have resulted in a loss of control over one’s privacy. But this is misleading. It suggests that one might have had control in a more personal setting – that an in-person meeting might guarantee one’s ability to shut another up, impounding the information forever. Did we ever have that level of control? No.

But Searls has tapped into something. It stems from a fundamental disquiet around the social contract that Eben Moglen describes in Freedom in the Cloud with Facebook, among other social networks: “I will give you free web hosting and some PHP doodads and you get spying for free all the time.” He’s tapped into the disquiet around actual control over the architecture of social interactions. It’s control, not the lack thereof, that is startling.

The architecture of social interactions, to be sure, is a loaded phrase. For our purposes, it can be simplified and thought of along three dimensions. First, it reflects the conditions under which one might share information. One might share a status update with Friends on Facebook or tell a colleague of a weekend about town over coffee. Second, it speaks to how another might absorb the information – ranging from listening carefully to surveillance of a Friend’s wall on Facebook. Third, it is shaped by how people think about their audience. Are these systems for addressing individuals or groups? A person or a public? boyd’s talk glances at this, but does not tease out its underlying influence on sharing and surveying.

It would be wrong to say that we ever fully controlled sharing or surveillance in any of the real world interactions of which boyd speaks wistfully. These encapsulate a flexible, mutable set of considerations and circumstances that one might make or be subject to with each interaction. Should I tell so and so? Is this the right place for it? Will someone overhear? What will they do with the information once they have it? We can edit ourselves, choose the conditions of how we share information. But we have to make compromises. We can make judgments around the setting and the person. We might even influence how they treat the information. But we don’t control what they do with that information. We may be careful, but we don’t really control any of it.

Social networks and interactions on the internet, however, introduce actual control over how we share with and survey one another. Control amounts to the easy ability to publicize what boyd calls personally identifiable information and personally embarrassing information. There are two parts to this: the ability to share more effectively and pervasively; and the ability to listen and survey more broadly. It’s not that we are giving up control, as boyd says. We didn’t have it in the first place. It’s that we’re seeing it for the first time. Control is over publicity, not privacy, and it sits with whomever or whatever has the information.

A discreet email might feel selective and appear to impound the information forever, but it can just as easily be circulated to another and another and another. That email or Facebook photo or blog post, unlike hearsay and the slow erosion from memory of a coffee-shop confession with a close acquaintance, can circulate with alarming ease and absolute fidelity to the initial confession. Indeed, systems on the internet don’t so much impart control over privacy, but over publicity. In a matter of keystrokes, damaging, embarrassing or otherwise hilarious information can be shared, surveyed, and shared again – increasingly open to the deliberate or serendipitous surveillance of many more people than might otherwise be intended. Each digital footprint stirs with potential energy.

The rising claim that privacy is dead, boyd suggests, imbues these systems with a prejudice for publicity. Fulfilling Moglen’s social contract, social networks design their systems to increase the velocity of sharing and improve the powers of surveillance. The obvious example comprises the PHP doodads from Moglen’s quote, but the counterpart is how social networks change how people present themselves and the information they share. It’s a change that shifts participation toward publicity.

Social networks orient one’s sharing and surveying toward groups, not individuals. The orientation levels one’s relationships according to various categories of access. One group can see only your public profile. Another, your entire wall and collection of embarrassing photos. But everyone is addressed in the same way through status updates, postings: each according to their clearance, and without regard for who they are individually. A user wrestles with the idea of the public, not the idea of a friendship.

boyd characterizes Twitter accordingly and starts to draw a distinction from Facebook. She suggests Twitter “evolved to be primarily about those seeking an audience and those seeking to follow or contribute to a public in some way.” Users invent a persona and participate in a system designed for publicity. She argues that Facebook, however, is “still fundamentally about communicating with a specific set of people who are, by and large, your friends.” But suburban Facebook’s engineering, through likes, posts, and zombies, encourages addressing a group, a public, not an individual — even before the recent changes in privacy policies that accidentally may have led to some over-sharing by unsuspecting users.

Sharing with a public, surveying a public – these activities engage the public. They not only depend on the public, they drive publicity. boyd warns us with a distinction, “there’s a big difference between something being publicly available and being publicized.” But the shift toward control in the architecture of social interactions erases the difference between publicly available information and publicized information. Public information is publicized information.

The shift that we’re observing is one toward greater control, not less. Enhancements to one’s ability to share and survey information introduce massively distributed control and gear the engines of publicity. With each individual arranged as a node in the network, equipped to survey and share as they wish, oriented to an ever changing public, we are seeing a shift toward control, not away. And with it, the realization that more control means more publicity.

Bruce Sanford and Bruce Brown commented in the WSJ on “Google and the Copyright Wars” (11/12). Many are focused on the status of orphan works in the Google Books project, but Sanford and Brown argue that the controversy centers not on orphan works but on fair use and its application by search engines. They would say that a search engine’s use of the web’s content is definite and definitely unfair.

Fair use of a book’s content, a website, or even the news underpins a search engine’s ability to find and deliver websites to users of the internet. Sanford and Brown stake out a position for search engines that is similar to a public library. Just as a library can employ the contents of its archive to establish an index for its patrons, the search engine uses the contents of the internet to establish an index for anyone at all. Sanford and Brown, however, contend that search engines are not libraries, so fair use does not apply.

Sanford and Brown argue that two distinctions separate search engines from the library model. Search engines not only copy text, they reproduce it in their results as snippets. Rights of reproduction are protected for copyright holders. Second, search engines sell advertising, and the sale of advertising is contingent on their ability to copy, store and reproduce copyrighted material. These distinctions, argue Sanford and Brown, disqualify search engines from the safe harbor of any exemption made for libraries. Their remedy: legislation.

The problem is, search engines don’t find safe harbor in the library model, and legislation is not the answer. Yes, a library applies fair use in its practices, and search engines have been compared to them in the past, but not all applications of fair use are found in the confines of a library. This may be why Sanford and Brown are so quick to demand legislation to expand copyright, even though expanding copyright may drive more business to the lawyers who protect it than to the websites involved.

The Ninth Circuit court framed a four-factor test for fair use in the case of Perfect 10 v. Google, et al. in May 2007. The test would distinguish between copyright infringement and fair use in the case of Google’s use of Perfect 10 material in its search results. The four factors comprise: the purpose and character of the use; the nature of the work, i.e., fact-based or creative; the amount of the work used; and the effect on the market for the work. None of them invokes the metaphor of the library used by Sanford and Brown.

When Google displayed the Perfect 10 images, the Ninth Circuit determined that all four factors weighed in Google’s favor. The images may have been highly original, but the results incorporate “an original work into a new work, namely an electronic reference tool,” and this is highly transformative: “a search engine may be more transformative than a parody because a search engine provides an entirely new use for the original work, while a parody typically has the same entertainment purpose as the original work.” Though Google would use a degraded thumbnail version of the image, its “use of the entire photographic image [is] reasonable in light of the purpose of a search engine.” The Ninth Circuit, therefore, reasoned that Google’s use of Perfect 10 thumbnails would be considered fair use. Though it didn’t provide a final decision, the opinion did suffice to vacate Perfect 10’s preliminary injunction against Google.

Sanford and Brown mistake the metaphor of a library as the only example of fair use when alternatives, such as the Ninth Circuit’s opinion, are perfectly acceptable. Perhaps this is why, having fleshed out their metaphor, they seize on legislation as a solution. Indeed, they would have Congress assert, “once the cache is monetized for the benefit of a search engine, the line of copyright infringement is crossed.” Isn’t this a sort of Hail Mary pass to rights-holders?
Legislation could make it illegal to monetize a cache without permission, but it’s not the panacea that Sanford and Brown are driving at. It would be, if the legislation mandated payments to rights-holders, but that is probably not a suggestion that would be found in the pages of the Wall Street Journal. More likely, payment would remain voluntary, leaving websites in a prisoner’s dilemma. If everyone cooperates and insists on payment, it will be to their mutual advantage, but the search engines direct so much traffic that each website has an incentive to break ranks; hence, everyone reluctantly opts in for fear that they’ll be the lone hold-out. In effect, it’s as though the legislation never happened, with one important distinction: there’s a new law on the books that requires a few good lawyers to understand. Perhaps that’s what’s really driving Sanford and Brown’s comment.
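The prisoner's-dilemma logic can be made concrete with a toy payoff table. The numbers are invented purely for illustration; the only thing that matters is their ordering, which mirrors the argument: breaking ranks is the best response no matter what everyone else does, even though mutual holding out would pay more.

```python
# Hypothetical payoffs for a single website. Invented numbers; only the
# ordering matters. (my_choice, others_choice) -> my payoff.
PAYOFF = {
    ("hold out", "hold out"): 10,  # everyone insists on payment: licensing revenue
    ("opt in",   "hold out"): 12,  # break ranks alone: capture the search traffic
    ("hold out", "opt in"):    0,  # lone hold-out: invisible to search engines
    ("opt in",   "opt in"):    5,  # everyone opts in: the status quo
}

def best_response(others: str) -> str:
    """Pick the choice that maximizes my payoff, given what the others do."""
    return max(["hold out", "opt in"], key=lambda me: PAYOFF[(me, others)])

print(best_response("hold out"))  # opt in
print(best_response("opt in"))    # opt in
```

Opting in dominates, so every site reluctantly opts in, landing the industry at a payoff worse than the cooperative outcome it could not sustain.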

There is an exception, however. Not all players are equal in this game. Some may wager that holding out is viable regardless of legislation or of what others do. That’s exactly what News Corp has done. It has begun negotiating a possible payment from Microsoft for the exclusive right to index its content. Though the move has been derided by many on the internet, should the two reach an agreement, their example will prove an important experiment in the question of paying for content.

Tomorrow, starting tomorrow, we are going to pick Trenton up and we are going to turn it upside down

Chris Christie

The same focus I put on the issues they were concerned about four years ago, I will put on property taxes and auto insurance because those things are too high and we need to get them under control.

Christine Todd Whitman, the last Republican governor who focused on property taxes [NYT: JENNIFER PRESTON; Friday, October 24, 1997]

This link has led to the fear that the Whitman tax cut would simply result in a dollar-for-dollar rise in local property taxes, thus negating any savings that taxpayers might realize.

Tim Goodspeed, November 1997, Manhattan Institute report, considering the link between state income tax, property tax, and school funding. Goodspeed remarks on the potential for Whitman’s income tax cuts to lead to a corresponding increase in property taxes. Because 80% of income tax revenue would go to school districts, and 20% would go to municipal aid and the homestead rebate, a decline in income taxes could force property taxes up to maintain funding across each of these budget items: schools; municipalities; and homestead rebates.

Goodspeed goes on to suggest that the flypaper effect would mitigate increases in property taxes, and the early reports were great. Jim Saxton said, in a report to Congress,

A recent study by two economists from the Manhattan Institute, Timothy Goodspeed and Peter Salins, shows that most New Jersey localities did not raise property taxes after the Whitman tax cuts. A few localities raised taxes. On average, for every dollar cut in state income taxes, local taxes rose by only twenty-two cents. A typical household saved over $200 per year in state taxes. Households still witnessed a net tax cut of $156 dollars. The well-being of the New Jersey family is that much better by controlling more of their own resources.
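Saxton's figures are internally consistent, and the arithmetic is worth making explicit: a $200 state-tax saving, offset at twenty-two cents of local tax per dollar cut, nets out to $156.

```python
# Checking the arithmetic in Saxton's summary of the Goodspeed-Salins study.
state_saving = 200.0   # typical household's annual state income-tax saving
offset_rate = 0.22     # local taxes rose $0.22 for every $1.00 cut

local_increase = state_saving * offset_rate   # $44 in higher property taxes
net_cut = state_saving - local_increase       # $156 net tax cut

print(f"local offset: ${local_increase:.0f}, net tax cut: ${net_cut:.0f}")
```

The point of the later paragraphs, of course, is that the 22-cent offset did not stay at 22 cents.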

Nonetheless, Goodspeed’s report does concede that “higher income districts…tended to raise their property taxes by more than other districts after the Whitman tax cuts,” which would account for the few localities that raised taxes. As we now know, these were soon followed by increases across the board, belying the flypaper effect.

GOV. CHRISTINE TODD WHITMAN: Yes, property taxes are going up, but that’s a function of local spending. It is not inexorably linked to the income tax, which is what everybody wants to make it seem. When my predecessor raised taxes $2.8 billion and put $1.5 billion directly into the school districts, through the Quality Education Act, property taxes still went up.

MAN ON STREET: The fact is that income tax cut only lowered our income tax by a miniscule amount, and in order to make up for the difference for the school budgets and whatever the townships need, everybody had to get a raise in their property taxes. My property taxes in the township I live in went up 14 percent, which equated to about, uh, $475 this year, because of the fact that she cut our income tax or gave us a reduction.

PBS News Hour: Interview with Whitman and others, November 1996

Real reform is going to require really tough choices at the local level. Our citizens should be asking why New Jersey, the most densely populated state in the country, spends more than any other state to bus a child to school. Citizens should ask, ‘Does New Jersey really need 1,600 separate units of local government?’

Whitman, in a speech to lawmakers on January 26, 1999, intimates that the structural issue behind property taxes resides at the municipal level. Nonetheless, she outlined an aggressive spending plan that did little to lower property taxes aside from introducing rebates. Assembly Speaker Jack Collins, a Republican from Salem County, called it “a Christmas budget…Everyone should be happy with it. I think that it touched on every segment of our society: education, crime, the elderly and tax relief. I think it should be getting bipartisan support.” Democrats, on the other hand, remained concerned, and the Democratic Assembly leader, Joseph Doria of Hudson County, observed, “New Jersey residents will still be the most highly burdened taxpayers in the country.” Whitman, however, continued to push the property tax issue from the state to the local level, which meant asking 566 cities and towns, 21 counties, 188 fire districts, and 611 school districts to sit down and sort it out — good luck with that.

Taxes in New Jersey represent 1.74% of a home’s value, compared with the national median of 1%. Essex County carries the fifth-highest tax burden per person, nationally. Westchester comes in first, with Putnam County, NY [oddly] coming in at number 10.

Tax Foundation, 2009 report

[The portals] have made assumptions about using our content which are wrong, and we are prepared to demand appropriate compensation.

—Tom Curley, President and CEO of the AP, which now comprises 1400 member newspapers, and former publisher of Gannett’s USA Today: via WSJ

This is about what content providers must do in the digital era. That starts with doing a much better job of protecting the content we create

Tom Curley, AP: via FT

AP, Thomson Reuters and other news agencies have begun working with third-party content identification firms such as Attributor to track the flow of their material across blogs, websites and aggregators. [FT]

Any time you talk about a tracking system, the thrust of [the commentary] is about enforcing copyright. But what we hope is the outcome out of this is the ability to enable more licensed uses of content. We want to keep the content open, we don’t want to keep it behind firewalls.

Jim Kennedy, the AP’s VP of strategic planning: All Things D

What we are building here is a way for good journalism to survive and thrive. The AP news registry will allow our industry to protect its content online, and will assure that we can continue to provide original, independent and authoritative journalism at a time when the world needs it more than ever.

Dean Singleton, chairman of the AP Board of Directors and vice chairman and CEO of MediaNews Group Inc: via World Editors Forum

Fair Syndication Consortium and Attributor

Nice container. Because they need it to protect such impressive stories as this.

