UbuntuOneFilesNotes11.10
Introduction
In this tutorial, you will build a simple script to interact with files stored in the Ubuntu One cloud. This won't be a fancy graphical application, just a simple command line Python script to show you how it's done.
You'll need to be using Ubuntu 11.10 or later.
Basics
First, we need a skeleton of a script, something that can take arguments like "get", "put", or "list" and then do something.
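Based on that description, a minimal sketch might look like the following (the `run` helper is just one way to structure it; later snippets in this tutorial perform the same argument checks inline at the top level of the script):

```python
#!/usr/bin/python

import sys

def run(args):
    # Complain if we weren't given a command...
    if len(args) <= 1:
        return "Need more arguments"
    # ...otherwise, just echo the command back for now
    return args[1]

if __name__ == "__main__":
    print(run(sys.argv))
```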
This is a very brain-dead script; it merely complains about not being given an argument or prints out whatever argument it was given. But it's enough to start with. We won't be worrying too much about error handling in this tutorial.
Save it with the name "u1file.py".
See 1.py for a copy of the script at this point.
Logging In
One thing we have to do before we can even talk to the Ubuntu One cloud is log into it. There’s a utility class in ubuntuone.platform.credentials to help with this. It’s designed to be asynchronous, but for our simple purposes, we’ll fake synchronicity by waiting in an event loop while it finishes.
So let's add a new function to our script:
#!/usr/bin/python

import sys

_login_success = False
def login():
    from gobject import MainLoop
    from dbus.mainloop.glib import DBusGMainLoop
    from ubuntuone.platform.credentials import CredentialsManagementTool

    global _login_success
    _login_success = False

    DBusGMainLoop(set_as_default=True)
    loop = MainLoop()

    def quit(result):
        global _login_success
        loop.quit()
        if result:
            _login_success = True

    cd = CredentialsManagementTool()
    d = cd.login()
    d.addCallbacks(quit)
    loop.run()
    if not _login_success:
        sys.exit(1)


if len(sys.argv) <= 1:
    print "Need more arguments"
    sys.exit(1)

if sys.argv[1] == "login":
    login()
Note that if you're using Twisted and its asynchronous support, you can use CredentialsManagementTool().login() directly and not worry about this loop trick.
Now we can call our script like so: "python u1file.py login" and we will be prompted by Ubuntu One to log in. You'll notice that if you call "python u1file.py login" multiple times, you don't get prompted again. That's because the user's credentials are saved locally.
In order to test logging in again, you'll need to first clear the user's credentials. Let's add a tiny "logout" function to do this for our testing purposes:
def logout():
    from gobject import MainLoop
    from dbus.mainloop.glib import DBusGMainLoop
    from ubuntuone.platform.credentials import CredentialsManagementTool

    DBusGMainLoop(set_as_default=True)
    loop = MainLoop()

    def quit(result):
        loop.quit()

    cd = CredentialsManagementTool()
    d = cd.clear_credentials()
    d.addCallbacks(quit)
    loop.run()

if sys.argv[1] == "logout":
    logout()
See 2.py for the full script at this point.
Creating Volumes
Ubuntu One has a concept called a volume, which is a cloud folder that can be synchronized to the user's devices. There is always a default volume called "Ubuntu One", and there may be other volumes that the user has created.
Let's augment our script to be able to create volumes. New volumes are not synchronized to any devices by default. Trying to create a volume that already exists is not an error (much like logging in above when already logged in wasn't an error).
For this, we're going to need to make our first real cloud API call, which means we will need to sign our HTTP request with OAuth headers. That would normally be a pain, but there's a convenience call for this in ubuntuone.couch.auth:
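A sketch of what that snippet could look like — note that the exact "volumes" resource path below is an assumption on my part, modeled on the file-storage URLs used later in this tutorial, and the dispatch at the bottom assumes the login() function defined earlier in the script:

```python
import sys

# Assumed endpoint for the volumes resource (an educated guess, not
# confirmed against the API documentation)
VOLUMES_BASE = "https://one.ubuntu.com/api/file_storage/v1/volumes/~/"

def create_volume(path):
    import ubuntuone.couch.auth as auth
    import urllib

    # A PUT to the volume's URL creates it; re-creating an
    # existing volume is not an error
    return auth.request(VOLUMES_BASE + urllib.quote(path),
                        http_method="PUT")

if len(sys.argv) > 2 and sys.argv[1] == "create-volume":
    login()
    create_volume(sys.argv[2])
```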
This snippet, if added to the rest of your script, will allow you to call "python u1file.py create-volume testing", which will create a new "testing" volume for your use. If you now visit http://one.ubuntu.com/files/ you should be able to see your "testing" volume.
Note that we ensure that we are logged in before calling the cloud API function.
A couple of notes about legal volume paths. You can create a volume with a path like "one/two", but you cannot create nested volumes (a volume inside another volume). However, volumes can have normal folders inside them.
See 3.py for the full script at this point.
Uploading Files
Now that we have a fresh volume, let's put something in it. Uploading a file is a two-step process. First, we create the metadata; then, we upload the actual contents. Uploading the contents is done against a different host, "files.one.ubuntu.com", at a path returned to us when creating the metadata.
When uploading, you must specify a content type and length, but we'll detect both automatically.
Our initial request to create the file will return a JSON-encoded string. JSON is a way of encoding data structures as a string, and Python has excellent support for it in the "json" module. You'll see simple uses of it below.
def put(local, remote):
    import json
    import ubuntuone.couch.auth as auth
    import mimetypes
    import urllib

    # Create remote path (which contains volume path)
    base = "https://one.ubuntu.com/api/file_storage/v1/~/"
    answer = auth.request(base + urllib.quote(remote),
                          http_method="PUT",
                          request_body='{"kind":"file"}')
    node = json.loads(answer[1])

    # Read info about local file
    data = bytearray(open(local, 'rb').read())
    size = len(data)
    content_type = mimetypes.guess_type(local)[0]
    content_type = content_type or 'application/octet-stream'
    headers = {"Content-Length": str(size),
               "Content-Type": content_type}

    # Upload content of local file to content_path from original response
    base = "https://files.one.ubuntu.com"
    url = base + urllib.quote(node.get('content_path'), safe="/~")
    auth.request(url, http_method="PUT",
                 headers=headers, request_body=data)

if sys.argv[1] == "put":
    login()
    put(sys.argv[2], sys.argv[3])
So let's upload our own script for testing: "python u1file.py put u1file.py testing/u1file.py". If you now visit http://one.ubuntu.com/files/ you should be able to see your "testing" volume and its one file "u1file.py".
See 4.py for the full script at this point.
Downloading Files
Now that we have files to download, let's try that. Just like uploading, this is a two-step process. First, we ask for the metadata, then we download the actual contents. For the contents, we'll use "files.one.ubuntu.com" again.
def get(remote, local):
    import json
    import ubuntuone.couch.auth as auth
    import urllib

    # Request metadata
    base = "https://one.ubuntu.com/api/file_storage/v1/~/"
    answer = auth.request(base + urllib.quote(remote))
    node = json.loads(answer[1])

    # Request content and save it to the local path
    base = "https://files.one.ubuntu.com"
    url = base + urllib.quote(node.get('content_path'), safe="/~")
    answer = auth.request(url)
    f = open(local, 'wb')
    f.write(answer[1])
    f.close()

if sys.argv[1] == "get":
    login()
    get(sys.argv[2], sys.argv[3])
Try to download the script you uploaded before:
python u1file.py get testing/u1file.py /tmp/u1file.py
diff -u u1file.py /tmp/u1file.py
The diff command should not output any differences, proving that the file survived its round trip.
See 5.py for the full script at this point.
Listing Files
Listing files is very similar in form to requesting metadata about a file (as we did in the get example above). The only difference is that we'll additionally ask for information about folder children.
def get_children(path):
    import json
    import ubuntuone.couch.auth as auth
    import urllib

    # Request children metadata
    base = "https://one.ubuntu.com/api/file_storage/v1/~/"
    url = base + urllib.quote(path) + "?include_children=true"
    answer = auth.request(url)

    # Create file list out of json data
    filelist = []
    node = json.loads(answer[1])
    if node.get('has_children'):
        for child in node.get('children'):
            child_path = urllib.unquote(child.get('path')).lstrip('/')
            filelist.append(child_path)
    print filelist

if sys.argv[1] == "list":
    login()
    get_children(sys.argv[2])
To test this, try "python u1file.py list testing".
See 6.py for the full script at this point.
Querying Files
Getting specific information on a file is very similar to listing; you just don't ask for information about children.
def query(path):
    import json
    import ubuntuone.couch.auth as auth
    import urllib

    # Request metadata
    base = "https://one.ubuntu.com/api/file_storage/v1/~/"
    url = base + urllib.quote(path)
    answer = auth.request(url)
    node = json.loads(answer[1])

    # Print interesting info
    print 'Size:', node.get('size')

if sys.argv[1] == "query":
    login()
    query(sys.argv[2])
Try this out with "python u1file.py query testing/u1file.py"
See 7.py for the full script at this point.
Deleting Files
Last but not least, it is sometimes useful to delete files. This is probably the easiest example:
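A sketch of such a delete command, assuming the API accepts a standard HTTP DELETE on the same metadata URL we used for get and put (the `file_url` helper is my own naming, and the dispatch reuses the login() function from earlier):

```python
import sys

def file_url(remote):
    # Build the metadata URL for a remote path (quoting is the caller's job)
    return "https://one.ubuntu.com/api/file_storage/v1/~/" + remote

def delete(remote):
    import ubuntuone.couch.auth as auth
    import urllib

    # A DELETE request on the file's metadata URL removes the file
    return auth.request(file_url(urllib.quote(remote)),
                        http_method="DELETE")

if len(sys.argv) > 2 and sys.argv[1] == "delete":
    login()
    delete(sys.argv[2])
```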
Try this out with "python u1file.py delete testing/u1file.py". If you now visit http://one.ubuntu.com/files/ you should be able to see that the file "u1file.py" is no longer in your "testing" volume.
See u1file.py for the full and final version of your script.
Error Handling
One thing I didn't cover at all was error handling. I'll cover it briefly now.
If an error occurs on the cloud side, Ubuntu One will return a status code indicating what kind of error occurred. These status codes are standard HTTP codes.
To check the status, do something like: status = int(answer[0].get('status'))
Status codes in the 200s are success codes. Here are some important error codes:
- 400 is "bad request"
- 401/403 are "permission denied"
- 404 is "file not found"
- 503 is "servers busy, please try again in a bit"
- 507 is "out of space"
Sometimes you will also receive a generic 500 status message. This just means some internal error happened. Usually such errors have an Oops ID that you can use to report the problem to the Ubuntu One folks:
oops_id = answer[0].get('x-oops-id')
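Putting that together, a small helper (the name and the choice of exception are mine, not part of the API) could validate every answer returned by auth.request():

```python
def check_response(answer):
    # answer is the (headers, body) pair returned by auth.request()
    status = int(answer[0].get('status'))
    # Treat anything outside the 2xx range as a failure
    if status < 200 or status >= 300:
        # 500-level responses usually carry an Oops ID worth reporting
        oops_id = answer[0].get('x-oops-id')
        raise RuntimeError("Request failed with status %d (oops: %s)"
                           % (status, oops_id))
    return answer[1]
```

For example, check_response(auth.request(url)) returns the response body on success and raises on any error status.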
Conclusion
Hopefully that was useful! If you have any other questions or want to read more, read the official documentation: https://one.ubuntu.com/developer/files/store_files/cloud/
mterry/UbuntuOneFilesNotes11.10 (last edited 2011-09-09 19:00:29 by pool-96-237-177-69)