Linux Sysadmin Blog

Running ASDM Client From Your Linux Desktop


For those of us who use Linux as a desktop replacement for Windows, it would be nice to be able to run the ASDM client natively. If you haven’t upgraded your ASA/PIX to the latest ASDM, you should do so. The steps are described here.

Once you do upgrade your PIX/ASA to the latest version, you may run into another issue where your bundled version of Java will not connect to ASDM. The remedy for this, if you are using Fedora 10, can be found here.

Now, assuming that you have ASDM loaded and have opened access on the outside interface over port 4443 with:

http server enable 4443
http 0.0.0.0 0.0.0.0 outside

You can connect to your PIX/ASA over port 4443 and download the asdm.jnlp file to your PC with a web browser at https://external_ip_of_asa:4443

Once you possess the asdm.jnlp file, run javaws asdm.jnlp in a terminal. Upon successful login this will create a .asdm folder in your home directory with files inside, as well as a desktop shortcut. After this the asdm.jnlp file is no longer needed and can be erased.
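On a side note, the http 0.0.0.0 0.0.0.0 outside line above allows ASDM connections from any address on the internet. If you know which address you will be managing from, a tighter variant looks like this (the management IP below is just a placeholder; substitute your own):

```
http server enable 4443
http 198.51.100.7 255.255.255.255 outside
```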

Force Url to Use SSL/https


In some cases you may want your site to use SSL (https://) at all times. There are a couple of ways to do this:

1.) Using Htaccess/mod_rewrite. You only need to create a .htaccess file in your site’s document root and add the code below:

<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{HTTPS} !=on
RewriteRule ^(.*)$ https://%{SERVER_NAME}/$1 [R,L]
</IfModule>

The code above may not work on every Apache/PHP setup, but I’m not sure what the exact configuration variable is in those cases.

Anyway, here are my alternatives. Either of them is fine if you’re running HTTP and HTTPS on the standard ports (http=80, https=443); otherwise change the port value to your custom HTTP or HTTPS port. Change domain.tld to your domain.

<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{SERVER_PORT} ^80$
RewriteRule ^(.*)$ https://domain.tld/$1 [R,L]
</IfModule>

<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{SERVER_PORT} !^443$
RewriteRule ^(.*)$ https://domain.tld/$1 [L,R]
</IfModule>
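If mod_rewrite is not available at all, a simple alternative is a plain Redirect from mod_alias in the port-80 virtual host. This is just a sketch; the vhost layout and domain are assumptions, so adjust them to your setup:

```apache
<VirtualHost *:80>
    ServerName domain.tld
    # mod_alias Redirect: send every request to the https:// site
    Redirect permanent / https://domain.tld/
</VirtualHost>
```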

2.) PHP function. If your site uses PHP you can redirect the URL to SSL/https using this function:

<?php
function ForceHTTPS() {
    if (empty($_SERVER['HTTPS']) || $_SERVER['HTTPS'] != "on") {
    // if ($_SERVER['SERVER_PORT'] == 80) {   // <-- use this condition if the above does not work
        $new_url = "https://" . $_SERVER['SERVER_NAME'] . $_SERVER['REQUEST_URI'];
        header("Location: $new_url");
        exit;
    }
}
?>

If you are using an application/script where you can enter a site URL setting (either in the database or a config file), it is better to use that setting.

Let me know if you run into issues, maybe I can help. :)

Drupal Performance Tips From DrupalCon


Still reporting from DrupalCon. I have attended a number of sessions so far. Here are some highlights from those sessions on how to increase performance on your Drupal site.

  • Look at the number of requests a page makes to the server
  • Use YSlow to measure page rendering (page performance is often perceived, not just a function of server response time)
  • Remove the built-in search; use alternate solutions such as Apache Solr or the Google Search API
  • Use a CDN as much as possible
  • Use a reverse proxy cache and memcache
  • Obviously, use the Drupal cache

Some other notes that are somewhat related to Drupal performance and site performance management in a clustered hosting environment.

Manual updates and rollback

OLD WAY: tar, move/copy, untar, restart services
OLD WAY: rsync
BETTER WAY: Capistrano

Managing systems:

BETTER WAY: bcfg2

Monitoring Tools

  • Capacity/load: analyzing trends, predicting load, checking the results of configuration and software changes (Cacti, Munin)
  • Failure: analyzing downtime, sending notifications (Nagios, using NRPE agents to monitor diverse services (do we use it this way?); Hyperic)

Use monitoring tools to closely observe cluster replication and caching, as failures in these areas are the most difficult to solve.

Apache Solr Drupal Integration


I am at the Drupal Conference attending the Acquia Apache Solr presentation. This integration has a lot of promise in my opinion. The Drupal search is not that useful; we’ve actually replaced it on our implementations with Google Custom Search.

Apache Solr is an open source project:

Solr is an open source enterprise search server based on the Lucene Java search library, with XML/HTTP and JSON APIs, hit highlighting, faceted search, caching, replication, a web administration interface and many more features. It runs in a Java servlet container such as Tomcat.

Apache Solr has a lot of promise for improving search results which, considering the downsides of Drupal search, would greatly improve the user experience.

Acquia has the Apache Solr search service in beta right now and it will be offered as a hosted offering.

We saw a preview of the www.drupal.org site redesign and it definitely looks like they will be using Apache Solr. The demo of the search results page looked very promising, with features such as search suggestions, filtering, and more to come.

Free Trial of Cloud Computing


For those of you who have heard all the hype about cloud computing but haven’t dug your hands into it yet, there is a company offering free trials of cloud servers: http://www.rightscale.com/products/free_edition.php. They offer 10 hours of playing around with cloud servers on Amazon EC2, which amounts to a whopping $1 in value, but the good news is that you only have to put in an e-mail address to register. The images they offer are quite extensive, and using them will give you a good insight into the limitations of cloud servers. I found it very useful to see that there are pending times before a server actually gets started, and something similar when terminating a server. These times are usually several minutes and should definitely be taken into account when a server needs to be started in a hurry.

Additionally, I have to say that the scripting abilities offered by RightScale seem quite extensive and very useful. The scripting allows for some very powerful ways of setting up servers. The prices associated with those services, however, seem a bit steep compared to the bare-bones services offered by Amazon.

Even the Clouds Come Down to Earth - Cloud Services Crash Just Like Everyone Else Sometimes


During our weekly sysadmin call this morning, several of our experienced sysadmins quickly pointed out that clients seeking very high uptime should not necessarily look for it in the cloud. I couldn’t believe it, but almost as an omen, this story came up in my RSS feed from Webware: Google apologizes for email outage

Outages pose problems for Google as it tries to persuade companies to buy into its cloud-computing vision, in which applications are hosted on the Internet rather than on corporate computers. But Google argues its service availability is competitive with most organizations’ abilities to run their own e-mail servers.

Clearly the Google cloud isn’t going to be the only one having an outage from time to time. It seems to me that while still in their infancy, these services are vulnerable to unexpected problems, kind of like the famous first internet worm, the Morris worm, that brought the internet to its knees back in the 1980s.

While cloud services offer a lot of promise, and overall should offer a better level of redundancy and uptime, this shows that the cloud is not immune to some downtime either.

Here is the link to the original Google cloud problem blog post.

Ubuntu 9.10 “Karmic Koala” Will Use Eucalyptus for Your Own Cloud Computing Solution


Mark Shuttleworth announced last Friday that Ubuntu 9.10 will be named Karmic Koala, and also presented what we should expect from the future version of Ubuntu:

  • the desktop version will have a new look (more beautiful)

  • on the server side, as with everybody these days, Ubuntu’s future will be targeting the “cloud”. They will have officially supported Ubuntu Amazon EC2 AMIs as ready-to-run appliances. They will also support the open source project Eucalyptus, which enables you to create an EC2-style cloud using your own hardware. Eucalyptus will be included in the Ubuntu repos, allowing users to build and manage private clouds more easily.

For full details: https://lists.ubuntu.com/archives/ubuntu-devel-announce/2009-February/000536.html

Nagios: How to Check if Remote Process Is Running


We have a monitoring server running Nagios and we needed to add checks for the Nginx process on a new server. Basically, you only need to install NRPE to monitor services, processes, disk space, load, etc. on your remote machine. Check the NRPE documentation for a complete reference; here’s a quick NRPE installation guide for Debian.

For my objective I only need to check whether the Nginx process is running, so I will use check_procs. With NRPE and the Nagios Plugins installed, I can check the Nginx process locally using the following command:

/usr/local/nagios/libexec/check_procs -c 1:30 -C nginx

wherein:

-c 1:30 <– the Critical range for the number of Nginx processes. If the process count is below 1 or above 30, this will send me a Critical notice. If you want to add a Warning level you can use “-w 1:25” - adjust the number of processes for your needs.

-C nginx <– check for the command name (nginx)
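The min:max range logic can be sketched in plain shell. This is only an illustration of how a 1:30 critical range is evaluated, with a made-up process count, not what check_procs actually runs internally:

```shell
# Illustration only: evaluating a 1:30 critical range (count is hypothetical)
count=12
min=1
max=30
if [ "$count" -lt "$min" ] || [ "$count" -gt "$max" ]; then
  echo "CRITICAL: $count nginx processes (outside $min:$max)"
else
  echo "OK: $count nginx processes"
fi
```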

NOTE: For complete reference on this check and other samples please refer to the NagiosWiki page.

Below are my configurations:

NRPE(remote): /etc/nagios/nrpe_local.cfg

command[check_nginx]=/usr/local/nagios/libexec/check_procs -c 1:30 -C nginx

Nagios(host):  /usr/local/nagios/etc/objects/localhost.cfg

define service {
use                            generic-service         ; Name of service template to use
host_name                      HOST/IPADDRESS
service_description            CHECK_NGINX
check_period                   24x7
max_check_attempts             3
normal_check_interval          5
retry_check_interval           3
contact_groups                 Admins
notification_interval          480
notification_period            24x7
notification_options           w,u,c,r
check_command                  check_nrpe!check_nginx
notifications_enabled          1
}
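Note that check_command check_nrpe!check_nginx assumes a check_nrpe command object is already defined on the Nagios host. If yours is missing, the standard definition looks roughly like this (the plugin path is an assumption; adjust it to your install):

```
define command {
command_name    check_nrpe
command_line    /usr/local/nagios/libexec/check_nrpe -H $HOSTADDRESS$ -c $ARG1$
}
```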

The Nagios version is 3.0. Both the Nagios monitoring server and the remote server run Debian Etch.

Best Definition of Cloud Computing - to Date


The definition of cloud computing has been rather elusive. Some CEOs, like Larry Ellison, are frustrated because they claim that their software already runs on the cloud. Others, like the Slashdot rhino herd, made this cloud computing definition post one of the most popular of last year:

“Even though IBM’s Irving Wladawsky Berger reports a leading analyst as having said recently that ‘There is a clear consensus that there is no real consensus on what cloud computing is,’ here are no fewer than twenty attempts at a definition of the infrastructural paradigm shift that is sweeping across the Enterprise IT world — some of them really quite good. From the article: ‘Cloud computing is…the user-friendly version of grid computing.’ (Trevor Doerksen) and ‘Cloud computing really is accessing resources and services needed to perform functions with dynamically changing needs. An application or service developer requests access from the cloud rather than a specific endpoint or named resource.’ (Kevin Hartig)”

But this month, the Electrical Engineering and Computer Sciences department of the University of California at Berkeley put together a highly acclaimed paper on the cloud: Above the Clouds: A Berkeley View of Cloud Computing.

We uploaded it here also: Above the Clouds: A Berkeley View of Cloud Computing - a 25-page PDF document.

Here is how Berkeley sees it:

Cloud Computing refers to both the applications delivered as services over the Internet and the hardware and systems software in the datacenters that provide those services.

The services themselves have long been referred to as Software as a Service (SaaS).

The datacenter hardware and software is what we will call a Cloud. When a Cloud is made available in a pay-as-you-go manner to the general public, we call it a Public Cloud; the service being sold is Utility Computing.

We use the term Private Cloud to refer to internal datacenters of a business or other organization, not made available to the general public.

Thus, Cloud Computing is the sum of SaaS and Utility Computing, but does not include Private Clouds.

The paper is great - it is interesting to read, has some great quotes, and it lists opportunities and the current top 10 obstacles to cloud computing. Highly recommended.