Feb 07, 2014
 

I was trying out the new PHP SDK for AWS and relearning the methods. While it works well against AWS, I was a bit confused juggling the DHO documentation and the AWS docs; I wanted to use the SDK, but with DreamObjects (DHO).

There are advantages to the new library that are worth exploring. Some of the tasks we used to program by hand are already done for you, built on newer PHP 5.3-era tooling such as Composer and Guzzle.

After some trial and error with the configuration values I got things working. Here are the samples in case they help someone else.

AWS Docs: http://docs.aws.amazon.com/aws-sdk-php/guide/latest/quick-start.html

DHO Docs: http://docs.dreamobjects.net/s3-examples/php2.html

 

require('vendor/autoload.php');

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;
use Aws\S3\Enum\CannedAcl;

// Instantiate the S3 client with your DreamObjects credentials
$s3 = S3Client::factory(array(
    'key'      => 'from DH panel',
    'secret'   => 'from DH panel',
    'base_url' => 'https://objects.dreamhost.com',
));

// List all buckets owned by the account
$blist = $s3->listBuckets();
echo "Buckets belonging to " . $blist['Owner']['ID'] . ":\n";
foreach ($blist['Buckets'] as $b) {
    echo "{$b['Name']}\t{$b['CreationDate']}\n";
}

// Upload a local file as a publicly readable object
try {
    $result = $s3->putObject(array(
        'Bucket'     => '<mybucket>',
        'Key'        => 'data_from_file.txt',
        'SourceFile' => './test.txt',
        'ACL'        => CannedAcl::PUBLIC_READ,
    ));
}
catch (Exception $e) {
    echo "There was an error uploading the file.\n";
}
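Since the object goes up with a public-read ACL, a quick sanity check (just a sketch, reusing the same client and the <mybucket> placeholder from above) is to print the plain object URL and open it in a browser:

// Sketch: print the unsigned URL of the object uploaded above.
// getObjectUrl() with no expiry argument returns a plain, non-signed URL.
echo $s3->getObjectUrl('<mybucket>', 'data_from_file.txt') . "\n";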

 

Uploading an entire directory, but new objects only. It's not perfect with DHO; sometimes I get missing folders when uploading multiple folders. Start with the client instance as above, and the following code uploads only the differences between the local directory and the target. The difference detection works well: tested with 2,000 files and 4 folders, one level deep.

 

try {
    $dir       = 'folder1/';
    $bucket    = '<muhbuckit>';
    $keyPrefix = 'myprefix/';

    // Upload only the files that differ from what is already in the bucket
    $s3->uploadDirectory($dir, $bucket, $keyPrefix, array(
        'params'      => array('ACL' => 'public-read'),
        'concurrency' => 20,
        'debug'       => true
    ));
}
catch (S3Exception $e) {
    echo "There was an error uploading the directory.\n";
}

Concurrency works well on DHO too. If you run into issues setting up the SDK, post a comment and I will try to assist. With Composer it's quite easy (a minimal setup is sketched below), though for old timers like me it can take a while to wrap your head around just how easy it is. I'll post more samples as I write and use them.
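For anyone starting from scratch, the Composer side is roughly this (a minimal sketch; the 2.* version constraint is just an example, pin whatever 2.x release you are actually on):

{
    "require": {
        "aws/aws-sdk-php": "2.*"
    }
}

Run composer install in the project root and the vendor/autoload.php used in the samples above is generated for you.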

  7 Responses to “DreamObjects with AWSSDkforPHP 2, working examples”

  1. Just another tip for you on using DreamObjects with the AWS SDK for PHP v2. You can create a credential file, with the appropriate permissions set, then instantiate the client connection with it.

    Here’s an example of the credential file.

    <?php return array(
        'includes' => array('_aws'),
        'services' => array(
            'DreamObjects' => array(
                'extends' => 's3',
                'params' => array(
                    'key'      => 'Your_Access_Key',
                    'secret'   => 'Your_Secret_Key',
                    'base_url' => 'https://objects.dreamhost.com'
                )
            )
        )
    );
    ?>

    Then in your code you load the custom config like this:

    use Aws\Common\Aws;

    $aws = Aws::factory('path/to/credential/file.php');

    Then use the DreamObjects client:

    $dreamobjects = $aws->get('DreamObjects');

    $blist = $dreamobjects->listBuckets();
    echo "Buckets belonging to " . $blist['Owner']['ID'] . ":\n";
    foreach ($blist['Buckets'] as $b) {
        echo "{$b['Name']} {$b['CreationDate']}\n";
    }
    Nothing fancy but I think it’s a little cleaner than keeping credentials in my code.

  2. Hi Justin,
    Thanks for sharing the interesting bits; that's a much better example of reusable code. There is a lot in the new AWS SDK I need to explore. Love the fact that concurrency works pretty well on DHO as well. All I need now is headers. Not sure if I can set Expires, Content-Type, etc. headers.

    Thanks again. Please feel free to share more 🙂

  3. You can set headers and it's pretty straightforward. Based on your example using putObject, here's how you'd do it:

    try {
        $result = $dreamobjects->putObject(array(
            'Bucket'       => '<mybucket>',
            'Key'          => 'data_from_file.txt',
            'SourceFile'   => './test.txt',
            'Expires'      => 'Wed 12 Feb 2014 20:00:00 GMT',
            'ContentType'  => 'text/plain',
            'CacheControl' => 'no-cache',
            'ACL'          => CannedAcl::PUBLIC_READ,
            'Metadata'     => array(
                'first' => 'Justin',
                'last'  => 'Lund',
                'comp'  => 'DreamHost',
            ),
        ));
    }
    catch (Exception $e) {
        echo "There was an error uploading the file.\n";
    }

    You can do this with CacheControl, ContentDisposition, ContentLanguage, ContentType, ContentEncoding, ContentLength, and even add metadata. Hope this is helpful!
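    A quick way to double-check that the headers actually landed on the object (just a sketch along the same lines, reusing the client and the <mybucket> placeholder from the upload example) is to read them back with headObject:

    // Sketch: fetch the headers/metadata that were just set on the object.
    $head = $dreamobjects->headObject(array(
        'Bucket' => '<mybucket>',
        'Key'    => 'data_from_file.txt',
    ));
    echo $head['ContentType'] . "\n";   // e.g. text/plain
    echo $head['CacheControl'] . "\n";  // e.g. no-cache
    print_r($head['Metadata']);         // the custom first/last/comp values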

    • Oh! When I read the DHO docs, they said it wasn't yet possible. Thank you for this. I must update my code right away.

    • Would you know how to generate a pre-signed URL using the virtual hostname? I have already set this up and it works when I hard-code the hostname for public objects. For pre-signed URLs I cannot find a particular method.

      • I don't know of a way to generate the pre-signed URL using the CNAME, so I'd recommend generating the URL and then substituting the hostname. I just did a quick test and this worked for me.

        $bucket  = 'my-bucket';
        $my_file = 'filename';
        $CNAME   = 'sub.domain.com';

        // Generate a pre-signed URL valid for one hour...
        $signed_url = $dreamobjects->getObjectUrl($bucket, $my_file, '+1 hour');

        // ...then swap the DreamObjects hostname for the CNAME
        echo str_replace($bucket . '.objects.dreamhost.com', $CNAME, $signed_url);
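        To reuse this, it can be wrapped in a small helper (a sketch under the same assumptions; presigned_cname_url is just an illustrative name):

        // Sketch: sign first, then swap the DreamObjects hostname for the CNAME.
        function presigned_cname_url($client, $bucket, $key, $cname, $expires = '+1 hour')
        {
            $signed = $client->getObjectUrl($bucket, $key, $expires);
            return str_replace($bucket . '.objects.dreamhost.com', $cname, $signed);
        }

        echo presigned_cname_url($dreamobjects, $bucket, $my_file, $CNAME);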
