Posted by Abhishek Kumar | Last Updated: 14-Jan-19
Kaltura is a popular open-source video platform. It uses Apache as its default web server and Nginx for streaming. It is a complete package for hosting a video streaming platform, covering streaming, video storage, transcoding, a media player, and more. However, running Kaltura on a cloud platform such as AWS, Azure, or GCP can pose a storage challenge when the amount of media is very large: server storage costs can easily cross the $1,000 mark once storage reaches into the TiBs.
One option to reduce cost and improve performance is to use AWS S3 for storage and the CloudFront CDN as a delivery profile. CloudFront can be used as either an RTMP or an HTTP delivery profile. In this example, we will use CloudFront as an HTTP delivery profile.
Note: This setup has been tested on Kaltura version 14+.
Prerequisites
1) An S3 bucket with a folder named "kaltura" (or any name you prefer). Here we will use "mys3kaltura" for reference.
2) An Access Key and Secret Key with permission to read from and write to the Kaltura S3 bucket.
3) A CloudFront distribution (push origin) pointing to the S3 bucket.
To configure CloudFront with S3 as a push origin, please follow this AWS doc for reference: https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/GettingStarted.html
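The access key in step 2 only needs read/write access to the Kaltura folder, not the whole account. A minimal IAM policy sketch, generated here as JSON, using the placeholder bucket and folder names from this guide ("mys3bucket", "mys3kaltura"); review the actions against your own security requirements before attaching it:

```python
import json

def kaltura_s3_policy(bucket: str, prefix: str) -> dict:
    """Build a minimal IAM policy granting read/write on one bucket prefix.

    Bucket and prefix are placeholders from this guide; adjust to your setup.
    """
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                # Listing is granted at the bucket level, limited to the prefix
                "Effect": "Allow",
                "Action": ["s3:ListBucket"],
                "Resource": f"arn:aws:s3:::{bucket}",
                "Condition": {"StringLike": {"s3:prefix": [f"{prefix}/*"]}},
            },
            {
                # Object-level read/write/delete inside the Kaltura folder
                "Effect": "Allow",
                "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
                "Resource": f"arn:aws:s3:::{bucket}/{prefix}/*",
            },
        ],
    }

print(json.dumps(kaltura_s3_policy("mys3bucket", "mys3kaltura"), indent=2))
```

The printed JSON can be pasted into the IAM console when creating the user whose Access Key and Secret Key you will enter into Kaltura later.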
Once S3, the access key, and CloudFront are configured, log in to the Kaltura Admin Console.
Creating Delivery Profile
First, we need to create the delivery profile.
On the Publisher tab, choose "Delivery Profile" under "Profiles" for the selected publisher.
Create a new delivery profile by choosing the "HTTP" type and selecting the "Create New" button.
Fill in the required information:
General
Delivery profile name*: mycdnprofile  # Replace "mycdnprofile" with your own tag
Delivery Status: ACTIVE

Delivery Info
Delivery profile Type*: HTTP
Streamer Type*: HTTP
Supported Media Protocols: HTTPS
Delivery profile URL*: https://your_cloudfront_domain.cloudfront.net/mys3kaltura  # Note: "mys3kaltura" at the end is the folder name inside the S3 bucket; replace it with your own folder name.
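The delivery profile URL is simply the CloudFront distribution domain with the S3 folder name appended. A small sketch of that composition, using an illustrative CloudFront domain and the placeholder folder name from this guide:

```python
def delivery_profile_url(cloudfront_domain: str, s3_folder: str) -> str:
    """Compose the HTTP delivery profile URL from the CloudFront
    distribution domain and the Kaltura folder inside the S3 bucket."""
    return f"https://{cloudfront_domain.strip('/')}/{s3_folder.strip('/')}"

# "d111111abcdef8.cloudfront.net" is an illustrative distribution domain
print(delivery_profile_url("d111111abcdef8.cloudfront.net", "mys3kaltura"))
# → https://d111111abcdef8.cloudfront.net/mys3kaltura
```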
Creating Remote Storage
On the Publisher tab, choose "Remote Storage" under "Profiles" for the selected publisher.
Create a new remote storage profile by choosing the "Amazon S3" type and selecting the "Create New" button.
Fill in the details below:
General
Remote Storage Name*: "mys3storage"
Delivery Status: ACTIVE
Delivery Priority: 0  # 0 is the highest priority; helpful when using multiple remote storage profiles
Storage URL: https://mys3bucket.s3.amazonaws.com
Storage Base Directory: mys3bucket/mys3kaltura
Path Manager: "External Path"  # External Path stores data in a date-based structure; Kaltura Path stores data in Kaltura's default structure.
Storage Username*: "AccessKey"
Storage Password: "SecretKey"
S3 Region: "BucketRegion"
Server-Side Encryption (SSE) Type: NONE  # If KMS encryption is enabled, you can set the SSE type to KMS or AES256 and provide the KMS key. Make sure the access key and secret key have the necessary permissions to use the KMS key.

Delivery Details
Add a new delivery detail and choose "mycdnprofile", which we created earlier.
Save and exit.
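A common mistake here is a mismatch between the Storage URL and the Storage Base Directory. A hypothetical sanity-check helper (not part of Kaltura) that verifies the two fields agree, following this guide's convention that the base directory begins with the bucket named in the storage URL:

```python
from urllib.parse import urlparse

def check_remote_storage_fields(storage_url: str, base_dir: str) -> list[str]:
    """Return a list of problems found in the remote storage fields.

    Hypothetical helper: checks that the Storage URL looks like an S3
    virtual-hosted endpoint and that the Storage Base Directory starts
    with the same bucket name, as in this guide's example values.
    """
    problems = []
    host = urlparse(storage_url).netloc
    if not host.endswith(".s3.amazonaws.com") and ".s3." not in host:
        problems.append("Storage URL does not look like an S3 endpoint")
    bucket = host.split(".s3")[0]
    if base_dir.split("/")[0] != bucket:
        problems.append("Storage Base Directory should begin with the bucket name")
    return problems

print(check_remote_storage_fields(
    "https://mys3bucket.s3.amazonaws.com", "mys3bucket/mys3kaltura"))
# → []
```

An empty list means the two fields are consistent with each other.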
Make sure the remote storage profile's status is enabled and set to "Automatic". This will automatically export uploaded content to the S3 bucket.
#NOTE: The remote storage profile should not be set to "Disabled" or "Manual Only".
Granting Remote Storage permission to the publisher
Once the delivery and remote storage profiles are created, we need to grant the publisher permission to use the remote storage.
On the Publisher tab, choose "Configure" under "Actions" for the selected publisher.
Under "Remote Storage Policy":
1) Enable the "Remote Storage only" option if you plan to use only remote storage.
2) Check "Delete exported storage from Kaltura" to delete media from the server once it has been exported to the remote storage.
3) Check "Remote Storage Delivery Priority" if you are using multiple remote storages and want to use priority-based delivery.
Under "Enable/Disable Features":
Check the permissions below to allow the publisher to use remote storage:
1) Content Ingestion - Ingestion from Remote Storage
2) Remote Storage
Save and Exit.
Once done, the remote storage will be available for use. Uploaded content will be saved to the S3 bucket and served through CloudFront.
To confirm that media is being served from CloudFront, follow either of these steps:
1) Play the content in the Kaltura player and inspect the request URLs with a network sniffer of your choice.
2) Go to the Kaltura Admin Console -> Batch Process Control -> Entry Investigation -> select "By Entry ID" -> enter the entry ID of your uploaded content -> select "Search".
Under "Entry General Info", the links in "www", "cdn", and "raw" should redirect to the CloudFront URL.