I mentioned to a couple of friends last week how I got a few Adobe AIR applications up and running on my Ubuntu install. I walked one person through how I did it and decided to also make a quick video tutorial on the process.
A couple of disclaimers. First, I use my Ubuntu laptop on a daily basis and have had it up and running for a while, so I am not sure whether I installed some supporting packages that make this work; I haven't tried this on a fresh default install of Ubuntu. Second, I was trying to keep the video somewhat short, so I didn't get into some of the quirkiness I see in the applications; on Pownce, for example, you don't get the embedded video feeds. Remember, this is alpha software, and it seems like how well an application is written also factors into how well it runs under Linux.
Hope you enjoy the video. Here are the links from the video in case you miss them.
UPDATE: I mentioned in my video that you don't need to save the .air file but can instead just launch it with "Open With" in the dialog box. This works fine on my Gutsy box but doesn't seem to work on my newer 8.04 Hardy box. Even though Hardy recognizes that the file should be launched using the "Adobe AIR Application Installer", I still get the error "the associated helper application does not exist" when I try using it. You can, however, save the file to disk, right-click, and select "Open With Adobe AIR Application Installer", and it works fine. Not sure why I see this behavior.
UPDATE (2008-Sept-16): Adobe moved the Linux version of AIR to beta (link) and this has fixed a couple of things. First, the issue from my last update doesn't appear to be a problem anymore on Hardy. Second, now AIR Applications have their pretty icons. And finally, and most exciting, Pandora AIR client now works!! 🙂
I went on to explain: Would you rank a "closed source" solution with a great set of easy-to-use APIs on the same level as a good Open Source solution?
So for example, let's say your company needed a blog. You could download any number of Open Source blog packages and customize one to your company's needs; you could spend money on a proprietary blog system that forces you to do things a certain way; or would you rather have a closed source (still proprietary) solution that "out of the box" has you do things a certain way but also offers APIs that let you build your own interface or feed your blog into another solution?
Here are a couple of the great replies I got back:
Having access to the source code is fantastic, since it gives you a great way, in theory at least, to take matters into your own hands and diagnose any issue or add any feature you wish. If you're in a large corporate environment and you need to build a custom extension to make the software acceptable to your users (e.g., integrating with your company's proprietary SSO solution), then this can be invaluable.
I think most times the decision will come down to supporting the application and overall cost. Sometimes bigger business needs to have an expert to call when things go south. In smaller businesses, sometimes you wing it.
Each solution has its place. You just have to make that decision in the best interests of your business and your goals.
It's not a question of open or closed, and it's definitely not a question of hosted versus SaaS. It's about whether the solution is component-based, tailored, scalable, and replaceable.
For instance, if you opt for a closed solution with really great APIs, that works so long as everything you need to do has been anticipated by the API provider. Unless you use a standardized API (like OpenSocial) or an API so dominant that the market provides good adapters to standards (like Facebook's), you will always be dependent upon the development and deployment plan of the closed system you've selected.
I think you actually answered the question yourself, Eric. In my experience, the answer to this always comes down to the two main points you mentioned. Do you NEED the flexibility of an OSS solution? If you have the resources, and the need for true customization, then OSS is always a great option.
Regarding patches and security, a well maintained OSS project with an active community and a wide install base is generally going to be ahead of the curve when it comes to bug fixes, etc.
I think it goes without saying that I am a huge supporter of Open Source, but believe it or not, I am also a big fan of Bill Gates. You've got to respect what he did for the industry and what he does for the world through his charities. The guy is a geek's version of Michael Jordan. I don't agree with everything the man does, obviously, but I think he has a great mind, and he will be missed in the computer industry when he finally moves on.
Anyways, I enjoy watching Bill Gates talk but I kind of caught something interesting in this last interview I was watching.
At about 1:45, Bill talks about how finding the source code for an operating system took him and Paul Allen to the next level of tinkering. I thought to myself how fortunate we were that he had the opportunity to look at and study an operating system's source code. It would have been a real shame if Bill Gates had been deprived of the ability to look at source code. I wonder how many great thinkers Linux has inspired, or will inspire, by offering them a similar opportunity.
Firefox 3 browser: 3 million-plus downloads worldwide and growing. Wonder what it's like being the only dude in Eritrea to have downloaded it. That guy (or gal, let's be fair) deserves some Firefox swag.
It's amazing to me that I still stumble across sites like this, especially "real" sites from "real" companies. This is a media company, no less. The funny thing is, running Internet Explorer under Windows is one of the last combinations I actually use; even under Windows, Firefox is my default browser. The show I was trying to check out was called "The IT Crowd". Seeing as I live in the US and don't get Channel4, chances are the only way I am going to see it is to stream it. Now, the reality is I could easily move to one of my Windows boxes or even fire up my virtual Vista desktop on my current desktop, but I am not going to do that out of general principle. Sorry, Channel4.
If you spend any amount of time on the Internet and are anything like me, you usually have a set of sites you visit on a regular basis. Here is a quick time saving tip if you use Firefox. Organize all your commonly visited web sites into a single "Daily Setup" file. I actually have two, one personal that opens all my favorite web sites that I usually read at least once a day and another for work which opens a bunch of work related sites. Then all you need to do is click on your Daily Setup bookmark, select the setup you want (personal or work), and click on "Open All in Tabs" and BAM you are off and running with your daily injection of Internet feed.
So typically I would not recommend enabling auto login on your system, but there are some exceptions. For example, when that system is a virtual machine running on your desktop.
I've been playing around a lot with my Ubuntu configuration on a virtual machine running under VMware Fusion. I got tired of having to log into the machine every time I booted it, when I was already logged into the desktop of the host machine. So for the first time, I found myself trying to figure out how to auto login to my Linux desktop. In my typical "learn the hard way" fashion, I overthought the problem and my approach several times before discovering how simple it was. A couple of clicks, to be exact.
Step 1: Fire up a terminal and type "sudo gdmsetup".
Step 2: This should bring up the "Login Window Preferences" dialog box. Click on the Security tab, check "Enable Automatic Login", select the user you want to auto login as, and click Close.
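For reference, that checkbox just writes the equivalent settings into GDM's configuration file, so you can also set it by hand. On the GDM 2.x shipped with Gutsy/Hardy the file is typically /etc/gdm/gdm.conf-custom, though the path varies by version, and the username here is hypothetical:

```
# /etc/gdm/gdm.conf-custom (path may differ by GDM version)
[daemon]
AutomaticLoginEnable=true
AutomaticLogin=eric
```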
REDMOND, Wash., and CUPERTINO, Calif. -- Oct. 31, 2006 -- Microsoft Corp. and Zend Technologies Inc. today announced a technical collaboration to enhance the experience of running the PHP scripting language on Windows Server® 2003. The parties expect to extend the collaboration to the next version of Windows Server, code-named "Longhorn." The resulting technology enhancements and ongoing interaction with the PHP community is expected to enable customers to take advantage of the Windows Server platform. The cooperative effort seeks to provide customers with richer functionality and better integration, resulting in improved performance and increased reliability.
Had a strange error message on one of the sites I manage: "Fatal error: Call to undefined function: pn_dbmsgerror()". This is a PostNuke CMS system, and the error occurred after a reboot of the physical server, when we made some space on a partition that had filled up.
Turned out the problem was a corrupted table in the PostNuke database. The _referer table got twisted and didn't know what it wanted to do with itself. This table is only used for tracking where visitors to your site come from, so since I had a backup of the database, I just nuked the table and copied the table over from the backup database. That got everything up and running again perfectly.
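The repair described above could be done with something like the following. The table prefix (nuke_), the database names, and the user are all hypothetical here, since PostNuke's table prefix is configurable per install:

```shell
# Drop the corrupted referer table from the live database...
mysql -u admin -p live_db -e "DROP TABLE nuke_referer;"
# ...then copy the intact table over from the backup database.
mysqldump -u admin -p backup_db nuke_referer | mysql -u admin -p live_db
```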
Do you need to change your web host or switch your database server? This is probably the only time when you really think of backing up your MySQL data. If you've got a website with a database or your custom database running for your applications, it is imperative that you make regular backups of the database. In this article, I will outline two easy ways of backing up and restoring databases in MySQL.
The easiest way to back up your database is to telnet to your database server machine and use the mysqldump command to dump your whole database to a backup file. If you do not have telnet or shell access to your server, don't worry about it; I shall outline a method of doing so using the phpMyAdmin web interface, which you can set up on any web server that executes PHP scripts.
Playing with mysqldump
If you have either shell or telnet access to your database server, you can back up the database using mysqldump. By default, the command dumps the contents of the database as SQL statements to your console. This output can then be piped or redirected to any location you want. If you plan to back up your database, you can redirect the output to a .sql file, which will contain the SQL statements to recreate and populate the database tables when you wish to restore your database. There are more adventurous ways to use the output of mysqldump.
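As one example of those more adventurous uses, the dump can be piped straight through gzip so the backup is compressed on the fly (the user, database, and file names here are placeholders):

```shell
# Compress the dump as it is written; restore later with: gunzip < backup.sql.gz | mysql ...
mysqldump -u username -p databasename | gzip > backup.sql.gz
```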
A Simple Database Backup:
You can use mysqldump to create a simple backup of your database using the following syntax.
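The syntax line itself appears to have been lost from this copy of the article; based on the placeholder list that follows, the standard mysqldump invocation is:

```shell
mysqldump -u [username] -p[password] [databasename] > [backupfile.sql]
```

Note that there is no space between -p and the password; with a space, mysqldump prompts for the password interactively and treats the next argument as the database name.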
[username] - this is your database username
[password] - this is the password for your database
[databasename] - the name of your database
[backupfile.sql] - the file to which the backup should be written.
The resultant dump file will contain all the SQL statements needed to create the tables and populate them on a new database server. To back up your database 'Customers' with the username 'sadmin' and password 'pass21' to a file custback.sql, you would issue the command:
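The command itself is missing from this copy; reconstructed from the values in the paragraph above, it would be:

```shell
# Dump the Customers database as user sadmin (password inline, no space after -p)
mysqldump -u sadmin -ppass21 Customers > custback.sql
```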
Just a good step-by-step on how to configure Apache to use a .htaccess file. I rarely ever use this method except in testing, so I always forget.
First, get your web administrator to enable your use of .htaccess files. This requires a stanza in ServerRoot/conf/access.conf like this:
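The stanza itself didn't survive in this copy. For per-directory password protection it would look something like the following; the directory path comes from the next paragraph, and the AllowOverride value is an assumption (AuthConfig is the minimum needed for .htaccess authentication directives):

```apache
<Directory /home/webber>
    AllowOverride AuthConfig
</Directory>
```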
where /home/webber is replaced by your home directory. Without this, the usual default is AllowOverride None, which means that .htaccess files are ignored. The above stanza allows .htaccess control in all subdirectories of the specified Directory.
Set up a reasonably secure directory for the password (and optionally the group) files. This directory should not be in the web document tree! If it is, someone who can learn or guess the URL of the password file can fetch it and try to crack the passwords. (This refers to visitors from elsewhere on the Internet. There is no simple way to prevent users with accounts on the web server host itself from snooping in the password file, so we will have to settle for security by obscurity and trust them not to try too hard.)
Let us name this directory http-etc, by analogy to the Unix /etc directory where the system passwd and group files reside. Place it in your home directory (not in public_html) so that it is outside URL space. Give it permission 701 = rwx-----x, meaning you, the owner, can do anything, and the web server, running as the ordinary user apache, can access the directory but cannot list it (so it must know the file names in advance).
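The directory setup described above boils down to two commands (directory name as in the text):

```shell
# Create the password directory outside the web document tree and restrict it:
mkdir -p ~/http-etc
chmod 701 ~/http-etc   # rwx-----x: owner has full access; others may traverse but not list
```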