
Apple filing shows modular "media center" with unified interface

With the advent of new forms of digital media and the various media devices that have cropped up alongside them, Apple several years ago began exploring a concept it calls a "multi-media center," which would combine and control many of those emerging media components from a central interface.

According to an October 2005 filing published for the first time on Thursday, media components supported by the exemplary system would include any computer-readable medium that contains digital data and/or any application that can access digital data, such as a DVD player or iPod.

The "multi-media center" itself could be designed in the form of a complex computer program, such as Apple's Front Row software, that would reside on one or more personal computers. The central-user-interface would then be capable of graphically representing each media-component as a selectable item in a main menu, with user input coming by way of a keyboard, mouse, wireless remote, or similarly capable device like the iPhone.

The plug-and-play simplicity of the media center would hinge heavily on a modular architecture that includes at least one software-based media-module for each of the media-components configured for the system. "A media-module can include or obtain data pertaining to a particular media-component (e.g., user interface menus, lists of digital data in the media-component)," Apple engineer Thomas Madden wrote in the filing. "In addition, a media-module can also identify media-player(s) and access information related to their media (e.g., music or movie lists)."
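A minimal Swift sketch of what such a media-module might expose is shown below; the protocol and its members (MediaModule, menuItems, mediaItems, preferredPlayerName) are illustrative assumptions rather than terms from the filing:

```swift
// Hypothetical sketch of a software media-module: one per media-component.
// Each module can describe its component's menus, media lists, and player.
protocol MediaModule {
    var componentName: String { get }

    // User-interface menus associated with the media-component.
    func menuItems() -> [String]

    // Lists of digital data in the media-component (e.g. music or movie titles).
    func mediaItems() -> [String]

    // Identify a media-player capable of presenting this module's media.
    func preferredPlayerName() -> String
}

// Example module for a music component such as an iPod.
struct MusicModule: MediaModule {
    let componentName = "iPod"
    func menuItems() -> [String] { ["Playlists", "Artists", "Albums"] }
    func mediaItems() -> [String] { ["Song A", "Song B"] }
    func preferredPlayerName() -> String { "Music Player" }
}
```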

However, the filing notes that media-modules would not directly control output. Instead, a module-controller communicates with the various media-modules and effectively controls the output generated in response to user input. "The module-controller can forward the input to various media-modules for processing and receive output from them," Madden explained. "Subsequently, the module-controller can use the output generated by the media-modules to perform the appropriate response (e.g., manipulate display of menus or presentation of media)."
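One way to picture that division of labor is the Swift sketch below, in which a controller forwards input to registered modules and alone acts on whatever output they return; the type and method names are assumptions for illustration only:

```swift
// Hypothetical sketch of the module-controller: modules process forwarded
// input, but only the controller acts on the output they generate.
struct UserInput {
    let component: String               // which media-component is targeted
    let command: String                 // e.g. "show menu", "play"
}

protocol InputProcessingModule {
    var componentName: String { get }
    // Process forwarded input and hand any output back to the controller.
    func process(_ input: UserInput) -> String?
}

final class ModuleController {
    private var modules: [InputProcessingModule] = []

    func register(_ module: InputProcessingModule) {
        modules.append(module)
    }

    // Forward the input, collect module output, then perform the response
    // centrally (here simply by returning text a display layer could draw).
    func handle(_ input: UserInput) -> String {
        modules.compactMap { $0.process(input) }
               .joined(separator: "\n")
    }
}
```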


A flow diagram of the proposed media center

In other words, the media-modules do not directly control the output of the multi-media center even though they may process the input and effectively generate the appropriate response. "Furthermore, media-modules can be isolated from each other," the filing states. "As a result, the media-modules cannot communicate with each other, but can be added or removed dynamically as they do not affect each other or a main (base) program that effectively runs the multi-media center."
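A small Swift sketch of that isolation, again using hypothetical names, might register and remove modules at run time without either one touching the other or the base program:

```swift
// Hypothetical sketch: modules are isolated, so they can be added or removed
// dynamically without affecting each other or the main program.
final class MediaCenter {
    private var menuProviders: [String: () -> [String]] = [:]

    // Adding a module never requires touching the modules already present.
    func addModule(named name: String, menu: @escaping () -> [String]) {
        menuProviders[name] = menu
    }

    // Removing a module leaves every other module untouched.
    func removeModule(named name: String) {
        menuProviders.removeValue(forKey: name)
    }

    func mainMenu() -> [String] {
        menuProviders.keys.sorted()
    }
}

let center = MediaCenter()
center.addModule(named: "DVD Player") { ["Play Disc", "Chapters"] }
center.addModule(named: "iPod") { ["Playlists", "Artists", "Albums"] }
center.removeModule(named: "DVD Player")    // the iPod module is unaffected
print(center.mainMenu())                    // ["iPod"]
```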

In one embodiment of Madden's invention, each media-module would include or could obtain information needed to construct menus for its associated media-component. "It should be noted that media-modules can construct their menus (or submenus) by using an User Interface Library (or library)," he wrote. "More particularly, media-modules can obtain a template or other tools (e.g., metric utilities, windows, views, widgets, sounds) from the User Interface Library (or library). As such, each media-module may select a user interface template (e.g., menus, window) from the User Interface Library (or library) and subsequently fill (or populate it) with the appropriate information (e.g., menu items)."
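Interpreted loosely in Swift, the template idea could look like the sketch below; UserInterfaceLibrary, MenuTemplate and their members are invented for illustration and are not the filing's actual interfaces:

```swift
// Hypothetical sketch: a module obtains an empty template from a shared
// user-interface library and populates it with its own menu items.
struct MenuTemplate {
    let title: String
    var items: [String] = []
}

enum UserInterfaceLibrary {
    // A real library would also offer views, widgets, sounds and metric
    // utilities, as the filing describes; this returns only a bare menu.
    static func menuTemplate(titled title: String) -> MenuTemplate {
        MenuTemplate(title: title)
    }
}

// The iPod module fills the template with the appropriate information.
var podMenu = UserInterfaceLibrary.menuTemplate(titled: "iPod")
podMenu.items = ["Playlists", "Artists", "Albums", "Songs"]
print(podMenu.title, podMenu.items)
```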

Additionally, media-modules would be able to identify media-players that can be used to present their media, and access information related to their media, such as a music or movie list. The module-controller would initiate the media-player associated with a media-component and subsequently forward any input associated with presentation of media directly to the media-player for processing. "As a result, the familiar look and feel of media-players can be preserved," Madden said.
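A final Swift sketch, with assumed names like PlaybackController and MediaPlayer, shows how presentation input could bypass the modules and go straight to an initiated player:

```swift
// Hypothetical sketch: the controller initiates the media-player a module
// names, then forwards presentation input directly to that player so its
// familiar look and feel is preserved.
protocol MediaPlayer {
    var name: String { get }
    func handlePresentationInput(_ command: String)
}

struct MoviePlayer: MediaPlayer {
    let name = "Movie Player"
    func handlePresentationInput(_ command: String) {
        print("\(name) handling: \(command)")   // e.g. "pause", "next chapter"
    }
}

final class PlaybackController {
    private var activePlayer: MediaPlayer?

    // Initiate the media-player associated with the selected media-component.
    func startPlayback(with player: MediaPlayer) {
        activePlayer = player
    }

    // Presentation input bypasses the media-modules entirely.
    func forward(_ command: String) {
        activePlayer?.handlePresentationInput(command)
    }
}

let controller = PlaybackController()
controller.startPlayback(with: MoviePlayer())
controller.forward("pause")     // prints "Movie Player handling: pause"
```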