Monday, December 07, 2009

When is someone going to kick these stupid arrogant fuckers from office?

Wednesday, December 26, 2007

This in from Media in Canada, 21 December 2007:
Programming News

Intelligence faces uncertain future

CBC insists it hasn't yet made a decision about Intelligence, and media buyers remain upbeat about Chris Haddock's beleaguered series, despite its weakening ratings. But word from the set is that there won't be a third season of the series.

The crime drama ended its second season with a two-hour finale this month and, according to CBC, attracted an average of 263,000 eyeballs this year, down from 312,000 in 2006/07. The season peaked at 327,000 and drew 315,000 to the closer.

"It's definitely at the low end of our prime-time performers," says CBC spokesman Jeff Keay. "The first season's numbers were relatively low too, but those who loved it, loved it a lot, and so we brought it back for another season on a new night." He adds that it is not yet known if the series will be brought back. "We haven't made a decision yet. That will happen in February."

However, a source close to the project and the production company, who spoke on condition of anonymity, says Haddock's company has notified staff not to expect their contracts to be renewed for season three. The staff believes the series has been dropped.

Neither Chris Haddock, who is working on a possible US version of the show for Fox, nor anyone from Haddock Entertainment was available for comment.

The series debuted Tuesdays at 9 pm, and lost opposite House, then moved this year to the same time on Mondays. But the switch didn't make a difference, and Haddock has recently been at war with the Ceeb in the news media, claiming the net was sabotaging the series by not promoting it. CBC answered that it has a limited promotional budget, and many other shows to push.

Ad buyers, however, still see value in Intelligence. "I would buy it. It's a good quality property," says Hugh Dow, president of M2 Universal in Toronto, though "it depends on the price. At the end of the day, key factors are the price it's being sold at and the demographic it's attracting. It may not have a large audience, but I look at [the series] as a niche market."

Florence Ng, VP broadcast investments at ZenithOptimedia, concurs. "The numbers are low, but it can be a function of a number of things. Promotion plays a part, yes and no. If the show isn't good, if it doesn't appeal to an audience, all the promotion you do won't make a difference," she says. "The longevity of a program is not a function of audience. CBC has a time slot to offer, it does not rely on US suppliers. I do not think they will replace or pull a show based on the numbers."

"We look at four key factors when assessing a show: the public value of the program in our role as public broadcaster, audience size, revenues, and cost," says Keay.

Regardless of the Ceeb's decision, says another Intelligence staffer who also declined to be named, "we know that this is a quality, marketable show."

From Playback Daily

Wednesday, October 31, 2007

What happens when your imported product dries up, and your domestic product has effectively died on the vine?

Media people in the US have been discussing for a few weeks some of the possible negative effects that a WGA strike might have on American television production, and the knock-on effects for broadcasting in the US.

What I'm wondering is what happens to telly in this country... seeing as how the private OTA networks have already abdicated their weekday content and scheduling to the Americans (and collect a pretty penny from Canadian corporate advertisers via simultaneous substitution*), what happens when that content is no longer there as promised?

Do you, as CTV or Global, simply suck it up from NBC, ABC, CBS, Fox et al and show whatever is replacing your original content? What if another Canadian network has already pledged it? (Crap. I can't find that article I saw online discussing this possibility.)

OK. There are bound to be insurance provisions in the private broadcasters' contracts with the advertisers and agencies to cover this eventuality. But still. You're CTV and Global. You have one major domestic product each, both of them half-hour sitcoms. A drama you were sharing with ABC Family has been put down (Falcon Beach), another one (Whistler) appears to be in limbo, and nothing seems to be in the pipeline.

What do you do?

Do you start raiding your other properties? In this case, cable specialty channels? NBC is currently rumoured to be considering exactly this for Battlestar Galactica, perhaps taking old episodes from the SciFi channel and showing them over the main network. Which may work. However, when they showed the miniseries a few years ago, ratings for the network tanked pretty badly (unfortunately, as it's my favourite show).

For CTVglobemedia this is now a moot point, as they own all the parts of the puzzle - CityTV currently shows S2 reruns from Space; conceivably they could do the same with the main network ... but what does Global do about replacing Bionic Woman, or Journeyman, or Life?

Do the US networks reimburse Canadian networks at any point?


* I am highly amused to discover that US cable providers do the same to Canadian content shown down south as well: see the article on syndication exclusivity, also via Wikipedia. For more information on the US side of things, see Nikki Finke's Deadline Hollywood.

Monday, October 22, 2007

From ZDNet:

How Linux Is Testing The Limits Of Open Source Development

The community's pushing a breakneck pace to add new kernel features, while struggling to keep up with bug fixes. Slowing down doesn't look like much of an option.

By Charles Babcock, InformationWeek
Oct. 20, 2007
URL: http://www.informationweek.com/story/showArticle.jhtml?articleID=202404635

As the latest release of the Linux kernel emerged this month, it reflected a dizzying number of changes. Kernel 2.6.23, coming just three months after the last update, incorporated business-friendly features, including better virtualization support and an update to the all-important scheduler, as well as the usual new device drivers and bug fixes.

The sheer number of changes coming every two to three months from Linus Torvalds' "code tree" is a sign of accelerating kernel development. The process so far has produced undeniably high-quality, reliable code.

But make no mistake: Torvalds is pushing open source development tactics to new extremes. As the kernel grows in size and complexity, the rapid-fire iterations are straining the capacity of the community of volunteers who test and debug them.

Yet Torvalds can't let off the gas, for two reasons. First, Linux can't afford to fall behind technically, or it'll lose ever-demanding business users. The new kernel, for example, has hooks to take advantage of the latest virtualization capabilities embedded in Intel and Advanced Micro Devices processors. Second, Linux needs to feed its developer community. New features keep coders from getting bored and moving on to other projects, and they attract new talent as coders age or drop out of the process.

The road map of new Linux features, informal and unpredictable as it is, springs from this tension, this constant drive to add features while maintaining quality and stability. Can this 16-year-old open source project be sustained for another 16 years on this scale? "No other open source project has gotten this large or moved this fast," says Dan Frye, an IBM VP who tracks the kernel process. "It's a first-of-a-kind developer community."

Business users depend on this pell-mell process to improve Linux on many fronts beyond virtualization, including power management and security. It can take as long as two years for these rapid kernel changes to find their way into the systems put out by Red Hat and Novell, which most businesses that run Linux use, so there's something of a buffer to the kernel's frantic development process. Still, as the kernel goes, so goes the future of Linux.

SPEED VS. QUALITY
Linux is gaining code at an average of 2,000 lines per day, despite Torvalds' goal of limiting the amount of code that gets into the kernel to keep it as efficient as possible. Linux's modular kernel is the core of the operating system, handling all general-purpose tasks, such as memory management, requests to the CPU, and input/output. It's surrounded by hundreds of add-on packages that do more specialized things, such as translate files between Linux and Windows and configure files for display on an Apache Web server. But the kernel must grow to handle more functionality, more hardware, more users. What started in 1991 as a hobbyist's 10,250 lines of code is now more than 8 million lines.
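Taken at face value, the growth figures in this passage can be sanity-checked with a little arithmetic. This is a rough sketch: the line counts, dates, and the 2,000-line daily rate come from the article, while the flat-averaging assumptions (365-day years, constant growth) are mine.

```python
# Back-of-envelope check of the kernel growth figures quoted above.
# Line counts and dates are the article's; the averaging is my assumption.

lines_1991 = 10_250        # Torvalds' original 1991 hobby release
lines_2007 = 8_000_000     # "more than 8 million lines" today
years = 2007 - 1991        # 16 years of development

# Average growth over the whole 16 years...
avg_per_day = (lines_2007 - lines_1991) / (years * 365)
print(f"~{avg_per_day:,.0f} lines/day averaged since 1991")

# ...versus the current pace the article quotes.
current_per_day = 2_000
print(f"~{current_per_day / 24:.0f} lines/hour at today's pace")
```

The long-run average works out to well under the current 2,000 lines a day, which is the article's point about acceleration; and 2,000 lines a day comes to roughly 83 lines an hour, in the same ballpark as the 86-line figure quoted below.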

Some think the kernel, clocked at 86 lines of new code per hour, is exceeding the software development speed limit. A key maintainer, Alan Cox, has warned that some device driver changes should get more testing before being incorporated into the kernel. Andrew Morton, a skilled programmer dubbed "the colonel of the kernel" after Torvalds tapped him as a general manager, has been outspoken on the problem of unfixed bugs in Linux. "I would like to see people spending more time fixing bugs and less time on new features," Morton says. "That's my personal opinion."

But Torvalds indicated at the recent Linux Kernel Summit in Cambridge, England, that he thinks he has erred in the past on the side of caution. Slow kernel releases cause logjams upstream as additions await their chance to get into the kernel. Contributors lose interest without immediate feedback from kernel maintainers and their trusted expert developers. (Torvalds didn't respond to an interview request.)


More bug fixes, fewer new features, Andrew Morton requests.

By erring on the side of speeding Linux's development, Torvalds is counting on the basic open source principle that many users testing frequent releases of the code are more likely to catch defects than a more structured testing process. Linux bugs crop up constantly as additions to the kernel are found not to work on certain hardware or to clash with other software, either inside or outside the kernel. Developers who submit code are expected to troubleshoot bugs as they crop up, but often they don't.

At the summit, Morton said he wanted to appoint "a nasty person" to be kernel bugmaster, someone to identify bug sources and "beat up on developers who do not fix bugs," according to kernel developer Jonathan Corbet's account, published by the Linux Foundation. Natalie Protasevich was named bugmaster, and Morton says she has brought more discipline to bug clean-up, even if she falls short of his description of preferred temperament. There were more than 1,500 bugs in the kernel's Bugzilla database; it's down to 1,400.

"It's become a very sophisticated balancing act between rapid development and complete code review," says Dirk Hohndel, who heads Linux and open source technology at Intel. Yet even at its breakneck pace, not every feature that developers want in--or businesses demand--sails right into the kernel.

The process can be frustrating for Linux business customers. At the European travel service Amadeus, Linux is central to a strategy of cutting infrastructure costs by a factor of 10 by phasing out mainframe systems and running Linux "on cheap hardware," says Fred Bessis, VP of technology and strategic planning. With 10 years of experience using Linux already, the company knows what it's getting into, including watching potentially useful new features creep toward business editions.

Holger Weisbrodt, Amadeus' senior systems programmer, says new hardware and drivers get quickly worked into the kernel, but new diagnostic and debugging tools are "taking pretty long to get there." He'd like to see more emphasis put on debugging tools in general.

The latest Linux version shows this unpredictable process at work with two new features, a new scheduler and improved virtualization, that took much different tracks to the kernel, each with its own risks and complications.

THICK SKIN REQUIRED
The process to get a feature approved can be unrestrained and bruising. That was the case with one of the kernel's most important recent gains, the scheduler. The kernel scheduler strives to combine the even-handed, time-sharing characteristics of Unix, so that it can deal with many tasks and users, with the pre-emptive interrupts of a real-time operating system, so that it can respond swiftly to unscheduled events. In commercial operating systems, these tend to be distinct functions. Linux wants to do both.

Contributors have been working on the scheduler for years, but one, Australian doctor Con Kolivas, made a splash in the open source community this summer by airing out, in a widely discussed Australian Personal Computer magazine article, the reasons he quit Linux development in frustration.

His code for kernel 2.6.23, which he dubbed the "-ck patchset," was reviewed by Ingo Molnar, a developer employed by Red Hat who, having previously contributed several schedulers, has become one of the trusted Linux experts on them. He found Kolivas' submission wanting when it came to the real-time aspects of scheduling but used it to produce his own version of a multipurpose scheduler. Such borrowing and grafting of other people's code is what the General Public License, under which Linux is distributed, was meant to encourage, and kernel maintainers try to pay tribute to their sources. Kolivas, who had been getting rejection slips on his proposed code, found the process aggravating.

Kolivas ran into something that can be a barrier to developers. He envisioned different schedulers being used depending on the task. Torvalds and his associates philosophically want basic functions to do things once and do them well, as opposed to generating alternative ways of doing them. That keeps maintenance simpler and interactions between the different kernel subsystems easier to predict. Torvalds imposes the discipline of that architecture, as do participants on the mailing lists where new code gets discussed--where Kolivas' code took a not-uncommon drubbing. "Some of the things said on the Linux kernel mailing list [about other developers' code] would probably get you fired at a commercial company," says Joel Berman, who watches the list as Red Hat's product management director.

What emerged in 2.6.23 was the Completely Fair Scheduler, a name that's in part an ironic comment on the need to make trade-offs in a scheduler. Just as Kolivas was displeased, so were those who want better real-time performance. They're hoping improvements on that front will be added next year.

VIRTUALIZATION ON A FAST TRACK
Contrast the years of jostling around the scheduler with the experience of Avi Kivity, an Israeli developer who submitted a large, 12,000-line batch of code called the KVM virtualization engine. It helps to be known to kernel developers and maintainers when submitting a patch, but "KVM came out of the blue," says Morton. "I had never heard of him or his company [Qumranet] before."

Kivity describes himself as a "longtime lurker" on the Linux kernel mailing list, reading it avidly and noting its expert personalities and debates without submitting much code himself. He designed KVM to what he considered kernel standards, kept the kernel's file system expert abreast of progress on the code, and responded immediately to questions and comments from kernel maintainers. KVM addressed an important need in Linux given the rush of interest in virtualization, giving the kernel its first features to exploit the latest virtualization hooks in the Intel and AMD chips. It also artfully made use of the kernel's scheduler and memory manager and affected little else in the operating system. The result was that KVM sailed into the kernel in less than three months after its submission last fall.

Adding code from a little-known author and a fledgling company was a risk, Morton says, since both could fade away, leaving no one with expertise in the code. But given the code's standalone approach, developers could just as easily remove it if it withered.

Even when code like KVM zips into the kernel, there can be a lag of a year or two before it's picked up by one of the top two enterprise distributions, Red Hat Enterprise Linux and SUSE Linux Enterprise Server. ("Community releases" such as Red Hat Fedora and Novell OpenSUSE are updated quickly.) That's to allow for extensive testing and support materials. Many businesses are happy to have that stability and hold off on having the latest and greatest kernel.

And yet Linux races ahead, with developers pouring new features into the kernel in the name of fame, curiosity, and sometimes salary. Over a recent 28-month period in which 11 new kernels appeared, the number of identifiable individual contributors went from 479 to 838. And for every person who gets his or her name on a block of code, there are probably three or four people who helped that person, goes the common estimate. That means 3,000 or so people are involved in each iteration of the kernel.
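The "3,000 or so" figure follows from the article's own numbers; a quick sketch makes the estimate explicit. The three-to-four-helpers multiplier is the article's; whether it includes the named contributor is my assumption.

```python
# The article's back-of-envelope contributor estimate, made explicit.
named_contributors = 838   # individuals credited in the latest kernel
helpers_each = 3           # "three or four people who helped" (low end)

# Each named contributor plus their unnamed helpers:
total = named_contributors * (1 + helpers_each)
print(f"~{total:,} people involved in each kernel release")  # ~3,352
```

Taking the low end of the multiplier gives roughly 3,350 people, which rounds comfortably to the "3,000 or so" the article cites.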

It's that volunteer community that the Linux movement's still counting on, even as the kernel maintainers, the skilled developers who lead Linux subsystems, are paid by companies such as Google, Hewlett-Packard, IBM, Novell, and Red Hat. That community is why Morton says there's not a "direct trade-off" between speed of development and reliability, since getting features out sooner gets them hammered on long before they'd show up in a business.

Still, there's a drawback, compared with commercial code. "I don't want to call it unpredictability, but you can't guarantee a delivery date," Intel's Hohndel says. "Linux code is delivered when it's ready."

In another two or three months, Torvalds will issue kernel 2.6.24, with a dozen or so new features produced and tested with the help of hundreds of developers who weren't involved in this month's release, with no way to know how much if any of it will eventually make it into business-tested versions. It's not really what anyone would call a product "road map." But so far, it hasn't steered businesses wrong.

Illustration by Dale Stephanos

Continue to the sidebar:
Seven Areas Linux Could Get Better

Sunday, October 21, 2007

From Media Daily News:
Nielsen: TV Strong, But Viewing Rates Slightly Down
by Wayne Friedman, Thursday, Oct 18, 2007 8:00 AM ET
EVEN WITH A FULL YEAR of DVR usage in its calculations, Nielsen Media Research says TV may be strong, but viewing actually dipped for the first time in over 10 years.

In overall viewing time, for viewers 2 years and older, average daily viewing was 4 hours 34 minutes--down one minute from a year ago. For prime time alone, the average for viewers 2 years and older was 1 hour 10 minutes--also down a minute. Household viewing in prime time also dipped two minutes, to 1 hour 52 minutes.

It's the first time TV viewing has dropped since 1996.

Nielsen calculations came from surveying TV viewing for the 2006-2007 season, which included a full seven days of DVR usage after programs' initial airings. In January 2006, DVR penetration into U.S. TV households was 8%. Now, Nielsen calculates DVRs are in 20.5% of TV homes.

In a release, Patricia McDonough, senior vice president of planning policy & analysis at Nielsen Media Research, offers a partial explanation. "There are numerous screens competing for time and attention, as well as consumer devices providing new ways for viewers to watch their favorite shows," she says.

However, even with the drop, TV is still at record levels overall, says Nielsen. Looking at total day household viewing data, it notes that levels are at the same place as a year ago: 8 hours and 14 minutes. "These trends demonstrate that tuning to traditional television remains strong," says McDonough.

Saturday, October 20, 2007

From Media Daily News:
On Media
Nobel Mechanism Theory Relevant To Media's Future
by Diane Mermigas, Wednesday, Oct 17, 2007 7:45 AM ET
THIS YEAR'S NOBEL PRIZE IN Economics, awarded to a trio of academicians for their work in "mechanism design theory," has everything to do with the way media-related businesses must reinvent themselves and their financial models for the digital age.

At its core, mechanism design theory encourages systematic thinking about how a new playing field and new rules of play can be modified to improve outcomes. In the nearly 50 years since it was developed, the theory has been broadly applied to business matters as far-ranging as valuing software patents, matching donated kidneys to recipients, calculating taxes, and setting up complex auctions. Application of the theory to the Federal Communications Commission's sale of radio spectrum secured the government the funds it sought, while ensuring public and small-business access. It has also been applied to the regulation of subscription television and the bundling and pricing of channels.

The initial objectives, per the economists, included the ability "to easily compare different models for selling goods" and provide enough economic incentives to ensure a win-win for all. Much of what the economists devised is relevant to media conundrums. Those include redefining and re-pricing advertising, content and services as they move from static to interactive digital platforms, and recalculating the value of advertiser connections from mass eyeballs to individually targeted consumers with whom they can transact online. Social networking, blogging and instant messaging are a means to a commercial end for those hoping to profit from the Internet.

What is needed are new metrics, pricing mechanisms and standards across which values can be transferred in the digital world. We also need a more equitable distribution of income and technical innovation that takes into account public and private goals. These considerations are similar to what drove the economists to analyze economic institutions and the interaction of buyers and sellers, and devise new allocation mechanisms. The Nobel Prize-winning American economists are Leonid Hurwicz, professor emeritus at University of Minnesota; Roger Myerson, a professor at the University of Chicago; and Eric Maskin, a professor at the Institute for Advanced Study in Princeton, N.J.

Many glassy-eyed folks in Hollywood and Silicon Valley, as well as Madison Avenue and Wall Street, would prefer to dismiss such talk as being too complex and irrelevant. But ask executives in those key geographic hubs about their new digital business models, formulas and values, and the usual response is: "We're working on it." To be fair, there are some occasional glimmers of hope.

The mechanism design theory (a branch of game theory, or what psychologists refer to as the theory of social situations) represents only one framework for constructing the new financial infrastructure needed to support media's evolving digital economy. The theory and its application underscore what we need: the creation of broad new frameworks for revaluing, reformulating and reassessing all aspects of media, entertainment and telecommunications in a nascent interactive market. That would include the value of TV and radio stations, advertising spots and page inches, and all forms of content and services. Mostly, it is about how to value, monetize and extend connections between target consumers and marketers, producers and distributors. It is the cog in the digital wheel that moves everything forward. It is required to reap any of the new wealth that's yet to be fully tapped.

For the most part, media-related players have transferred their existing business models, structures and products to new digital platforms, with modest results. There generally is minimal finance-related interactivity involved, other than consumers ordering goods and services online, trading stocks, and banking. Interactive wealth creation has barely begun. Relatively few companies and individuals are researching, constructing, testing and launching truly new business models.

This process must occur while old business models chug along and eventually are replaced. There is historical precedent in the Industrial Revolution and the computer age. Interactive commerce, creativity and communications likewise demand a completely new economic infrastructure to mine and manage digital wealth.

Companies can begin this daunting task by setting up grids to compare the pricing, income and cost of their products or services across the media spectrum. They can profile the existing and projected growth characteristics of each media sector, from print and billboards, to mobile phones and the Internet. They can establish an electronic system for tracking changes in forecasts for key metrics, such as consumer behavior, e-commerce and broadband use. This real-time contextual analysis must be accompanied by other exercises, such as creating models for product creation and services that utilize new technology and phase out legacy mechanics. By pulling more new levers--and eliminating the old ones--a steady and certain digital infrastructure change will occur.

The process of building new economic templates, processes and structures for digital media requires a brand of brainpower, patience and commitment that has become scarce. It would be a mistake not to meet the challenge, allowing digital transformation to unfold in undercapitalized bits and pieces. Here are some suggestions for immediate action:

  • Mandate the process as part of corporate 2008 budgets now being planned.

  • Devise a new baseline and definition for fundamental values.

  • Create new formulas for calculating and assigning prices, and play with them.

  • Create new metrics.

  • Experiment with bundling niche services and products in ways facilitated by new technology.

  • Study and cater to consumers' shifting priorities: What they want and how much they will pay.

  • Integrate new models and considerations into existing models.

  • Focus on the digital prize that cannot be attained without making the journey.

Diane Mermigas is editor-at-large at MediaPost. She can be reached at dmermigas@mediapost.com, or 708-352-5849.

Friday, October 19, 2007

From Deadline Hollywood:
Strike Vote In For WGA: 90.3% Say "Yes"
UPDATE: Hollywood writers tonight gave their guild leaders an overwhelming authorization to strike. A walkout could come as soon as November 1st, but I hear that's unlikely as long as progress is being made during the ongoing negotiations with studios and networks. The Writers Guild of America announced tonight that the 5,507 ballots cast among its 12,000 members represented the highest turnout in its history -- a number that underscores the passion and solidarity of the writers determined to win concessions from the Alliance of Motion Picture & Television Producers. "It shows an overwhelmingly engaged and activated community of writers who care about this negotiation and support our goals," said WGA West president Patric Verrone, himself an animation writer, in a statement tonight. "Writers do not want to strike, but they are resolute and prepared to take strong, united action to defend our interests. What we must have is a contract that gives us the ability to keep up with the financial success of this ever-expanding industry."

Earlier this evening, I reported that the WGA East authorized the strike by a blowout 90% even though the WGA East was expected to be more anti-strike than the far larger WGA West. "This historic vote sends an unequivocal message to the AMPTP, loud and clear," said WGA East president Michael Winship. "We will not be taken advantage of and we will not be fooled."

Tonight's total WGA tally blew away the producers. Judging from the phone calls I've received from several of the Hollywood moguls, the vote had its intended effect.

As I reported last night, the strike authorization "Yes" vote has OKed a walkout that will bring Hollywood to its knees when it happens. All that was left to be determined was the percentages by which the 12,000-strong Writers Guild of America membership wanted the labor action. These numbers were important to both the WGA and to the studios and networks in order to gauge the strength of the strike fervor. The feeling was always that if the total of the "Yes" votes was anywhere above 75%, the studios and networks had a giant headache on their hands.

I say the moguls shouldn't make the mistake of consoling themselves that most of the writers' ballots were mailed in before AMPTP took that residuals rollback off the bargaining table. Because it wouldn't have mattered: the percentages would have been the same. But that was a huge concession, despite the WGA negotiators' spin on it last night, and for the first time it sparked a mood of optimism among the writers that a strike could be averted. Yes, the WGA contract will expire Oct. 31 without a new one in place, but that date is meaningless because the WGA negotiators are not expected to call a strike until weeks after -- in fact, right before New Year's, when the timing will be most strategic and the moguls' vacations ruined.

The good news is the WGA and AMPTP are returning to the bargaining table on Monday for a face-to-face session, which has been rare during these negotiations, going on since mid-July. The bad news comes if the producers don't understand the degree to which this is a very determined and unified guild in no mood to be pushed around on the dozens of other rollbacks remaining besides residuals. The writers' negotiating team is determined to bring members a very real downloading income stream. Sure, right now no one knows what those revenues will be, but the guild won't move past the difficult New Media issue when the current WGA rallying cry is "Remember the DVDs!"

This WGA team isn't gonna fold like the previous crew led by John Wells because this time around the writers are really in charge, not the hyphenates. (I still marvel at the way Wells ran for WGA president in 1999 and won even though he was a preeminent TV producer, split the Writers Guild into haves and have-nots, then failed in 2001 to stand firm on any of the hard issues so as to ensure no strike would interrupt his own productions. As if that weren't chutzpah enough, shortly after the WGA contract was resolved, Wells quietly informed his West Wing writers that the provisions in their own contracts for increased pay and promotions would not be honored in the series' upcoming 3rd season. And the timing of his move made it almost impossible for them to find new jobs.)

Again, I'll reiterate that this strike can be averted only if the studios and networks decide to avert it. There's no question that the WGA top management and negotiating committee -- Patric Verrone, Dave Young, John Bowman -- outsmarted the moguls, who thought there wouldn't be a separate WGA labor action until June, when SAG's contract came due. The producers planned primarily for that timing, not for now. So studios found themselves suddenly scrambling to lock down projects and productions they thought had several more months of unfettered development before a walkout. But the networks were really beside themselves at the WGA's earlier-than-anticipated deadline: with so many deals and so much development done between the start of pilot season in January and the upfronts in May, a WGA walkout by January 1st means network TV is toast. And at a time when it can least afford it. (Realistically, how many consumers even differentiate anymore between broadcasters and cable? It's all just programming.) And that's not even taking into account how this fall's new shows are underwhelming if not downright tanking.

So far, the WGA has been wily. But they need to stop short of willful as well, especially with Friday's Dow Jones ending 360 points down, the current credit crunch, and the spectre of further infotainment consolidation (like General Electric selling NBC Universal to Time Warner). As for the AMPTP, this is no time for arrogance. The moguls need to provide an additional income stream that the guild can sell to its members as a major victory. Of course, New Media can still be tweaked and studied well into the future, but even a trickle of new cash flow can transform the WGA's attitude that it's losing too much financial ground to Big Media.

Do it, and do it now.

From Media Daily News:
WPP Reports Weakening Results, Especially For Media
by Joe Mandese, Friday, Oct 19, 2007 8:00 AM ET
WPP GROUP, THE WORLD'S LARGEST buyer of ad-supported media, this morning reported modest third quarter results, reflecting the impact of weakening U.S. dollar currency exchanges and lagging performance for its media buying and advertising operations.

On a bright note, revenues for WPP's brand and identity, healthcare and specialist communications operations soared 16% during the quarter thanks to increasing direct, Internet and interactive media spending in the U.S. and Europe.

Revenues for WPP's advertising and media investment management operations rose just 1.1% during the quarter. For the first nine months of 2007, the sector's revenues rose just 0.7%.

Overall revenues for the company, the parent of MindShare, Mediaedge:cia, MediaCom, Maxus, and agencies like JWT, Ogilvy & Mather and Y&R, rose 4.9% to $3.033 billion for the third quarter of 2007.

WPP insiders say relatively stronger results for WPP's GroupM unit are masked by its inclusion with other advertising services operations in its division.

Moreover, the division's "like for like" revenues, adjusting for currency differences and other anomalies, show the advertising and media investment management operations rising 3.7% in the quarter, and 5.2% for the first nine months.

In the near-term, WPP said prospects look better for early 2008, as the effects of quadrennial Olympic and U.S. presidential election cycles begin to have an impact. On a longer term note, the company expressed concerns about the impact of a new U.S. presidency.

"We continue to believe that 2008 will be a good year for the industry, better than 2007, reflecting the positive combined impact of the maxi-quadrennial events of the U.S. presidential election, the 2008 Olympics in Beijing and, on a relatively more modest basis, of the European football championships," WPP stated, adding the caveat, "We also continue to believe that a more important concern should be the impact that any new U.S. administration will have on 2009 - when they have seen the government's books and will be tempted to dispense any politically unpleasant medicine to the electorate, early in the potential eight year political cycle."

While emerging markets continue to play an increasing role in WPP's fortunes and the overall advertising economy, the company added, "it is still true that when the U.S. sneezes the rest of the world catches a cold."

Now the US may lose its anti-concentration/ownership rules too

I sure hope these guys have something to say about it.

From the New York Times:

Plan Would Ease Limits on Media Owners

By STEPHEN LABATON Published: October 18, 2007

WASHINGTON, Oct. 17 — The head of the Federal Communications Commission has circulated an ambitious plan to relax the decades-old media ownership rules, including repealing a rule that forbids a company to own both a newspaper and a television or radio station in the same city.


Kevin J. Martin, chairman of the commission, wants to repeal the rule in the next two months — a plan that, if successful, would be a big victory for some executives of media conglomerates.


Among them are Samuel Zell, the Chicago investor who is seeking to complete a buyout of the Tribune Company, and Rupert Murdoch, who has lobbied against the rule for years so that he can continue controlling both The New York Post and a Fox television station in New York.


The proposal appears to have the support of a majority of the five commission members, agency officials said, although it is not clear that Mr. Martin would proceed with a sweeping deregulatory approach on a vote of 3 to 2 — something his predecessor tried without success. In interviews on Wednesday, the agency’s two Democratic members raised questions about Mr. Martin’s approach.


Mr. Martin said he was striving to reach a consensus with his fellow commissioners, both on the schedule and on the underlying rule changes, although he would not say whether he would move the measures forward if he were able to muster only three votes.


“We’ve had six hearings around the country already; we’ve done numerous studies; we’ve been collecting data for the last 18 months; and the issues have been pending for years,” Mr. Martin said in an interview. “I think it is an appropriate time to begin a discussion to complete this rule-making and complete these media ownership issues.”


Officials said the commission would consider loosening the restrictions on the number of radio and television stations a company could own in the same city.


Currently, a company can own two television stations in the larger markets only if at least one is not among the four largest stations and if there are at least eight local stations. The rules also limit the number of radio stations that a company can own to no more than eight in each of the largest markets.


The deregulatory proposal is likely to put the agency once again at the center of a debate between the media companies, which view the restrictions as anachronistic, and civil rights, labor, religious and other groups that maintain the government has let media conglomerates grow too large.


As advertising increasingly migrates from newspapers to the Internet, the newspaper industry has undergone a wave of upheaval and consolidation. That has put new pressure on regulators to loosen ownership rules. But deregulation in the media is difficult politically, because many Republican and Democratic lawmakers are concerned about news outlets in their districts being too tightly controlled by too few companies.


In recent months, industry executives had all but abandoned the hope that regulators would try to modify the ownership rules in the waning days of the Bush administration.


“This is a big deal because we have way too much concentration of media ownership in the United States,” Senator Byron L. Dorgan, Democrat of North Dakota, said at a hearing on Wednesday called to examine the digital transition of the television industry.


“If the chairman intends to do something by the end of the year,” Mr. Dorgan added, his voice rising, “then there will be a firestorm of protest and I’m going to be carrying the wood.”


Supporters of the changes say that the rules are outdated and that there is ample empirical evidence to support their repeal. A small number of media companies, including The New York Times Company, are able to own both a newspaper and a radio station in the same city because the cross-ownership restrictions, which went into effect in 1974, were not applied retroactively.


Mr. Martin faces obstacles within the agency to overhauling the rules. One Democrat on the commission, Michael J. Copps, is adamantly opposed to loosening the rules. The other, Jonathan S. Adelstein, has said that the agency first needs to address other media issues, including encouraging improved coverage of local events and greater ownership of stations by companies controlled by women and minorities.


Advisers to Mr. Martin said he hoped to gain the support of at least one of the Democrats, probably Mr. Adelstein, but Mr. Adelstein said in an interview on Wednesday that Mr. Martin’s proposed timetable was “awfully aggressive.”


Three years ago, the commission lost a major court challenge to its last effort, led by Michael K. Powell, its chairman at the time, to relax the media ownership rules. The United States Court of Appeals for the Third Circuit, in Philadelphia, concluded that the commission had failed to adequately justify the new rules. Mr. Martin’s proposal would presumably include new evidence aimed at fending off similar legal challenges.


Mr. Powell’s effort, which had been supported by lobbyists for broadcasters, newspapers and major media conglomerates, provoked a wave of criticism from a broad coalition of opponents. Among them were the National Organization for Women, the National Rifle Association, the Parents Television Council and the United States Conference of Catholic Bishops.


The agency was flooded with nearly three million comments against changing the rules, the most it has ever received in a rule-making process.


Since the appeals court struck down the deregulatory changes, the commission has continued to study the issues at a leisurely pace, and it held a series of hearings around the nation. It had not made any new proposals, and industry executives had not expected the agency to move again so soon.


But in recent days, Mr. Martin has proposed to expedite the rule-making and hold a final vote in December. In part, he has told commission officials, he was reacting to criticism by Mr. Copps about temporary waivers that have allowed companies to own newspapers and stations in the same market.


Mr. Zell has said he wants to complete his $8.2 billion buyout of Tribune Company by the end of the year. Tribune had been granted what were supposed to be temporary waivers to the rule to allow it to control newspapers and television stations in five cities: New York, Chicago, Los Angeles, Hartford and the Miami-Fort Lauderdale area.


Mr. Copps, who for years has waged a campaign against media consolidation, said that it would be hard for the commission to proceed during an election year because media consolidation has provoked deep public skepticism in the past.


He said Mr. Martin’s proposal to complete a relaxation of the rules in December would require procedural shortcuts, giving the public too little time to comment on the proposals and industry experts too little time to weigh their impact on news operations.


“We shouldn’t be doing anything without having a credible process and nothing should be done to get in the way of Congressional oversight and more importantly, public oversight,” Mr. Copps said in a telephone interview from London. “We’ve got to have that public scrutiny. That was one of the big mistakes that Mr. Powell made, and he was taken to the woodshed by the Third Circuit. I fear it is déjà vu all over again.”

Bloodletting at the BBC

  1. 2,500 jobs to be cut, program output to decline by 10% [New York Times]
  2. Further 1,800 redundancies to be implemented [The Press Association]
  3. Factual programmes to be merged with BBC News, integrating the telly, radio and online operations into a single department [Straight from the horse's mouth @ the BBC's website]
  4. BECTU and NUJ polling for near-instant strike action after BBC DG Mark Thompson notifies key presenters that cuts will be 'as fast as possible' [Media Guardian]
  5. The BBC Trust unanimously approves plans to sell off the Beeb's West London headquarters [Another Guardian Media article]