
Video can be decoded to RGB(A) formats for playback in any existing application, or, if an application explicitly requests it, to compressed texture formats for accelerated playback on graphics hardware. The Hap QuickTime codec supports encoding and decoding Hap video.

This is the home of the Hap QuickTime codec. Hap is a video codec for fast decompression on modern graphics hardware. QuickTime codecs are no longer supported; please use a modern alternative, such as one listed on the Hap video website. For general information about Hap, see the Hap project. The HAP codecs email list is used to make occasional announcements; if you have questions or problems, visit the GitHub issues page. HAP copyright VIDVOX, LLC 2013-2019, all rights reserved. All names, logos and other properties mentioned on this site are copyright their respective owners.

Additional information can be found on the Using HAP page of this website. The source for the HAP QuickTime codec is available on GitHub. FFmpeg can be used to encode media to HAP from the command line. On macOS, HapInAVFoundation supports the full encoding process and includes a CLI for transcoding media files. The HAP reference encoder encodes compressed texture data to HAP frames. If you want to perform encoding within your application, you must first of all encode your RGB(A) frames to the appropriate compressed texture format, and then encode that data to HAP frames. A number of libraries are available to perform texture compression.
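The second of the two encoding steps described above can be sketched as follows. This is a minimal illustration based on the published HAP specification, in which each frame begins with a four-byte header: a three-byte little-endian payload size followed by a type byte whose high nibble names the second-stage compressor and whose low nibble names the texture format. The function name and constant names here are illustrative, not part of any library; a real encoder should follow the specification directly.

```python
# Sketch of the HAP frame layout, after the published HAP specification.
# Constant names are illustrative.

HAP_COMPRESSOR_NONE = 0xA    # no second-stage compression
HAP_COMPRESSOR_SNAPPY = 0xB  # payload is Snappy-compressed
HAP_FORMAT_RGB_DXT1 = 0xB    # "Hap"
HAP_FORMAT_RGBA_DXT5 = 0xE   # "Hap Alpha"
HAP_FORMAT_YCOCG_DXT5 = 0xF  # "Hap Q" (scaled YCoCg DXT5)

def hap_frame(texture_data: bytes,
              texture_format: int = HAP_FORMAT_RGB_DXT1,
              compressor: int = HAP_COMPRESSOR_NONE) -> bytes:
    """Wrap already-compressed texture data in a HAP frame header."""
    type_byte = (compressor << 4) | texture_format
    size = len(texture_data)
    if size < (1 << 24):
        # Four-byte header: 24-bit little-endian payload size, then the type byte.
        header = size.to_bytes(3, "little") + bytes([type_byte])
    else:
        # Extended header: three zero size bytes, the type byte, then a
        # 32-bit little-endian payload size.
        header = b"\x00\x00\x00" + bytes([type_byte]) + size.to_bytes(4, "little")
    return header + texture_data
```

A full encoder would first compress RGB(A) pixels to DXT1/DXT5 with one of the texture-compression libraries mentioned above, optionally run Snappy over the result, and then prepend this header. FFmpeg's hap encoder performs all of these steps; its format option selects between hap, hap_alpha and hap_q.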

HAP works by applying a lightweight secondary compressor to standard compressed texture formats. Playback involves retrieving frames from the container (demuxing) and decompressing them to their compressed texture data (decoding), which is then presented directly to OpenGL or Direct3D.

Below you can find information to help add support for HAP into your own software. In many cases you can simply use an existing framework or library to enable both playback and encoding. If you are already using AVFoundation, FFmpeg, LibAV, DirectShow or QuickTime for movie playback or encoding, it is relatively easy to add native support for the HAP video codecs to your own applications. A number of addons also enable HAP support in popular creative coding environments. For more information you can visit the HAP project page on GitHub.

Encoding HAP video within your application

For many applications, enabling playback is sufficient, and users can be directed to one of the available HAP codecs to encode their media. Additional information can be found in the sections below.
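The decode step described above can be sketched as follows: parse the four-byte frame header to discover the second-stage compressor and texture format, undo the second-stage compression if present, and hand the raw DXT payload to the graphics API (for example via glCompressedTexImage2D in OpenGL). This is a sketch based on the published HAP specification; the names are illustrative, and the Snappy branch is left to an external library.

```python
from typing import NamedTuple

class HapSection(NamedTuple):
    compressor: int      # high nibble of the type byte (0xA none, 0xB Snappy)
    texture_format: int  # low nibble (0xB DXT1, 0xE DXT5, 0xF scaled YCoCg DXT5)
    payload: bytes       # texture data, possibly still Snappy-compressed

def parse_hap_frame(frame: bytes) -> HapSection:
    """Split a demuxed HAP frame into type information and payload."""
    size = int.from_bytes(frame[0:3], "little")
    type_byte = frame[3]
    if size != 0:
        header_len = 4
    else:
        # Extended header: the real size is a 32-bit little-endian value
        # following the type byte.
        size = int.from_bytes(frame[4:8], "little")
        header_len = 8
    payload = frame[header_len:header_len + size]
    return HapSection(type_byte >> 4, type_byte & 0x0F, payload)
```

If the compressor nibble indicates Snappy, the payload must be decompressed first; the resulting DXT data can then be uploaded unchanged to a compressed GPU texture, which is what makes HAP playback cheap.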
