Copyright Office Unplugs Aereo’s Cable Claim

Reposted, by Deborah D. McAdams — 07.17.2014 03:40PM

Denies attempt at legal redefinition

WASHINGTON—The U.S. Copyright Office has refused Aereo’s claim that it is now a cable system and therefore eligible for the compulsory license that lets cable systems pay statutory fees to retransmit broadcast television.

“In the view of the Copyright Office, Internet retransmissions of broadcast television fall outside the scope of the Sec. 111 license,” which defines the “limitations on exclusive rights [for] secondary transmissions of broadcast programming by cable.”

Aereo took the U.S. Supreme Court’s June ruling, which likened it to a cable company, to mean that it was a cable company for legal purposes. Last week, the company filed comments with the U.S. District Court for the Southern District of New York imploring Judge Alison Nathan to find that it had the same status as a cable operator and thus avoid being shut down by the injunction the higher court directed. (See “McAdams On: Aereo’s Hail Mary.”)

Jacqueline C. Charlesworth, general counsel and associate register of copyrights, wrote in a letter to Aereo’s Matthew Calabro that Aereo does not meet the definition of a cable system as set forth in Sec. 111 of the Copyright Act, in part because it is not regulated as a cable system by the Federal Communications Commission.

“Sec. 111 is meant to encompass ‘localized retransmission services’ that are ‘regulated as cable systems by the FCC,’” Charlesworth said.

Aereo had sent the Copyright Office 14 account statements covering reporting periods from Jan. 1, 2012 through Dec. 31, 2013, along with $5,310.74 in filing fees. Charlesworth said that since the issue remains before Judge Nathan, the Copyright Office would accept Aereo’s filings on a provisional basis for further review “depending on regulatory or judicial developments.”


Better Pixels to Come – or Maybe to Some?

Deborah D. McAdams / 07.16.2014 04:08PM

EBU Sets a Path to 4KTV

Better pixels to come

GENEVA, SWITZERLAND—More pixels won’t cut the mustard for the European Broadcasting Union, which wants to see better pixels become a part of the standard for ultra high-definition TV.

“The [technical committee] believes that the current ‘4K ultra HD’ approach of the consumer electronics industry is unsatisfactory and will be of limited success in broadcasting,” the EBU said in its recent policy statement on UHDTV.

The better-pixel argument has been circulating among video engineers since the advent of 4KTV. The premise is that a higher frame rate, greater dynamic range and a wider color gamut provide a more noticeable picture improvement than simply more pixels. There is ongoing debate about whether higher resolution alone is even discernible to the average viewer in the average living room.

Televisions being marketed and sold now as “ultra HD” are simply higher resolution than hi-def, that is, 3,840×2,160 pixels versus 1,920×1,080 or 1,280×720. The EBU cited a DisplaySearch projection that 12 percent of TV sets sold next year will be the higher-res 4K sets.

“The EBU Technical Committee believes that the current focus of the [consumer electronics] industry to provide only an increased resolution—4K—and ignoring other enhancements is not a sufficiently large step for the introduction of successful new broadcasting services,” the statement said.

“The DVB Project has specified that a Phase 1 UHDTV broadcast format shall only include the higher resolution and does not take into account other enhanced parameters for ‘better pixels.’ The parameters—or a combination of them—that provide a more immersive viewing experience, such as frame rate, dynamic range, color gamut and enhanced audio are to be considered for a DVB Phase 2 UHDTV broadcast format,” the EBU said.

The EBU noted that YouTube, Netflix and Amazon can already deliver enhanced 4K, given adequate bandwidth. Meanwhile, NHK in Japan is working on delivering 8KTV for the 2020 Olympics.

“The impact of this on the rest of the world is unclear,” the EBU said.

A complete migration to 4K is not expected any time soon, particularly since many operations are not yet capturing and/or transmitting in HD. There’s also the issue of missing pieces, EBU said:

“Mainstream production infrastructures for 4K and UHDTV are still in development…. Many different combinations of parameters are currently under discussion and key interoperability standards are still missing.”

The EBU also said better pixels for HD were worth exploring.

“An enhanced, 1080p-based, HD service that includes a certain combination of UHDTV parameters except for the resolution increase, e.g. higher frame rate, higher dynamic range, wider colorimetry and advanced sound system audio, is not yet standardized,” it said.

“Such a 1080p-based HD format could be an appealing option for some broadcasters and should be taken into account in the standardization and investigation process. The EBU proposes that an enhanced 1080p format be developed for broadcasting.”

See the “EBU Policy Statement on Ultra High Definition Television.”


Amazon Launches New Tiny Cloud VM Instances

July 1, 2014 | Amazon Web Services launched T2, a set of cloud compute instances suited for low-impact applications, such as remote desktops, development environments, small databases and low-traffic web sites. The instances can burst up to higher power if needed through CPU credits.

The feature is yet another attempt to “right-size” Amazon cloud servers and give users confidence that they are using, and paying for, only the capacity they need. Very often, customers provision enough capacity to handle peak demand and then pay for it around the clock, even though most of that capacity sits unused most of the time.

“In many of these cases, long periods of low CPU utilization are punctuated by bursts of full-throttle, pedal-to-the-floor processing that can consume an entire CPU core,” writes Amazon chief evangelist Jeff Barr on the AWS Blog. “Many of these workloads are cost-sensitive as well.”

He used a car analogy: “Even though the speedometer in my car maxes out at 150 MPH, I rarely drive at that speed (and the top end may be more optimistic than realistic), but it is certainly nice to have the option to do so when the time and the circumstances are right.”

Like a car that rarely tops out, the new instances are for compute workloads with modest demands for continuous compute power that occasionally need more.

Each instance has a “baseline performance,” the percentage of a single core of the underlying physical CPU allocated to the instance. Each instance also earns CPU credits at a set hourly rate, and the credit balance grows whenever the instance uses less than its baseline allocation of CPU.

Credits are spent whenever the instance runs above its baseline, and unused credits are stored for up to 24 hours’ worth of accrual. The higher the baseline, the faster the instance accumulates credits.

A t2.small instance has access to 20 percent of a single core of an Intel Xeon processor running at 2.5 GHz (up to 3.3 GHz in Turbo mode). A t2.medium has access to 40 percent of the performance of a single core, which the operating system can spread across one or both of its cores as demand dictates. The smallest, t2.micro, has a 10-percent baseline.
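The earn-and-spend mechanics described above can be sketched as a toy model. This is illustrative only: the baseline percentages come from the article, but the per-minute tick and the accounting details are simplifying assumptions, not AWS’s actual implementation.

```python
# Toy model of T2-style CPU credit accounting. One credit here is one
# minute of a full CPU core; real AWS accounting differs in its details.

def simulate(baseline, demand_per_min, cap_minutes=1440):
    """Return (cpu_delivered_per_minute, final_credit_balance).

    baseline:        core fraction earned per minute (0.20 for a t2.small)
    demand_per_min:  requested CPU per minute, as fractions of one core
    cap_minutes:     credits are stored up to 24 hours (1,440 minutes)
                     of baseline accrual
    """
    cap = baseline * cap_minutes
    balance = 0.0
    delivered = []
    for demand in demand_per_min:
        # Credits accrue at the baseline rate and are spent at the usage
        # rate, so bursting above baseline is possible only while stored
        # credits remain; once they run out, the instance falls back to
        # its baseline share of the core.
        available = balance + baseline
        usage = min(demand, 1.0, available)   # capped at one full core
        balance = min(available - usage, cap)
        delivered.append(usage)
    return delivered, balance

# A t2.small idling at 5% CPU for an hour, then demanding a full core:
served, remaining = simulate(0.20, [0.05] * 60 + [1.0] * 30)
```

In this sketch the idle hour banks enough credits to fund roughly 11 minutes of full-core bursting before the instance throttles back toward its 20-percent baseline, which mirrors the “long periods of low utilization punctuated by bursts” pattern Barr describes.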

Barr noted that the new instances are perfect for business processes that need a burst of CPU power at regular but infrequent intervals, and for dynamic web sites that receive unpredictable bursts of traffic, whether from breaking news drawing a response, a link on Reddit (the “Reddit hug of death”) or inclement weather.

Credits will continue to accumulate if they aren’t used, until they reach a cap equal to an entire day’s worth of baseline accrual. If an instance is constantly maxed out on credits, it can be switched down to a smaller instance size.