About mikemansell

  1. Here's a pretty simple bash script for controlling bandwidth utilization:

#!/bin/bash
# This script will help you limit the amount of bandwidth that you consume so that you can predict/budget bandwidth fees
# while using services such as the RackSpace Cloud, which bill based on bandwidth utilization
# Requires "vnstat" and "screen"

# Maximum amount of bandwidth (megabytes) that you want to consume in a given month before anti-overage commands are run
MAX=10240

# Interface that you would like to monitor (typically "eth0")
INTERFACE="eth0"

function getusage {
    DATA=`vnstat --dumpdb -i $INTERFACE | grep 'm;0'`
    INCOMING=`echo $DATA | cut -d\; -f4`
    OUTGOING=`echo $DATA | cut -d\; -f5`
    TOTUSAGE=$(expr $INCOMING + $OUTGOING)
    if [ "$TOTUSAGE" -ge "$MAX" ]; then
        logevent "$TOTUSAGE/${MAX}mb of monthly bandwidth has been used; bandwidth-saving precautions are being run"
        iptables-restore < /etc/firewall-lockdown.conf
    else
        logevent "$TOTUSAGE/${MAX}mb of monthly bandwidth has been used; system is clear for the time being"
    fi
    sleep 300
    getusage
}

function logevent {
    STRINGBASE="`date +%d\ %B\ %Y\ @\ %H:%M:%S` -:-"
    MESSAGE="$@"
    echo "$STRINGBASE $MESSAGE" >> aolog.txt
}

if [ "$MAX" = "" ]; then
    logevent "The maximum monthly traffic level (\$MAX) has not been defined. Please define this and restart."
    exit
elif [ "$INTERFACE" = "" ]; then
    logevent "You have not defined the network interface (\$INTERFACE) that you want to monitor. Please define this and restart."
    exit
elif [ "`whereis vnstat`" = "vnstat:" ]; then
    logevent "It appears that you do not have \"vnstat\" installed. Please install this package and restart."
    exit
elif [ "`whereis screen`" = "screen:" ]; then
    logevent "It appears that you do not have \"screen\" installed. Please install this package and restart."
    exit
fi

if [ "$1" = "doscreen" ]; then
    getusage
else
    logevent "Starting vnstat interface logging on $INTERFACE"
    vnstat -u -i $INTERFACE
    logevent "Initiating screen session to run as a daemon process"
    screen -d -m $0 doscreen
fi
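The core of the script is the comparison of total usage against the monthly cap. A minimal standalone sketch of that check (the numbers here are made up, not from vnstat):

```shell
#!/bin/bash
# Standalone sketch of the script's monthly-cap check, with made-up numbers
MAX=10240        # monthly cap in megabytes
TOTUSAGE=10300   # hypothetical vnstat total (incoming + outgoing)
if [ "$TOTUSAGE" -ge "$MAX" ]; then
  STATUS="over"
else
  STATUS="under"
fi
echo "$STATUS"   # prints "over" with these numbers
```

In the real script, crossing the threshold triggers the iptables lockdown rather than just printing a status.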
  2. Hello all. To start off, I should point out that I've been an avid user of Microsoft Office (primarily Excel) for quite some time now, but have become more and more disappointed with the last couple of releases. Having said that, I recently bought a MacBook Pro and have been using a copy of Microsoft Office from work, again primarily for Excel. Having talked to a few people, I've been told that iWork is great for run-of-the-mill end users but that it really doesn't stand up to Microsoft Office for Mac when it comes to the more robust features. So I guess my question is: is iWork worth looking into? Seeing as it's a "generic" product (in comparison to MSO, at least) and hasn't seen a new release for two years now, I understand that it's not going to have all of the "new" features that I consider to be bloat in Office. But for you iWork users out there - especially those of you who work heavily with spreadsheets - do you find that it meets your day-to-day needs?
  3. Using Google Insights, I recently began looking at the search terms that people use when shopping for mobile phones. I found a number of things that show how people have compared mobile phones over time, and which products they compare. Given the success of the iPhone, I used it as a baseline of sorts and looked at the devices (BlackBerry, Android, Windows Phone) that people compared against it. Needless to say, I was really surprised at the number of people who compared BlackBerry devices to the iPhone, and I think it speaks volumes that people have more recently fallen away from making that comparison. Does this show that the BlackBerry has lost relevance to everyday consumers? I also thought it would be interesting to see what features people look for when shopping for a phone. Using the "best phone for x" queries, I found that the importance of texting, camera, and video has more or less stayed the same over the last few years. But unsurprisingly, the importance of mobile apps seems to have jumped a lot starting in mid/late 2009. What do you think this means?
  4. #!/bin/bash
# Screenshot and Upload Script
# Requires gawk curl xsel scrot openssl imagemagick sshpass
# This script uses OpenSSL to generate MD5 hashes for filenames.
# These hashes are based on the file itself and not the timestamp.
# When all is said and done, this prevents users from systematically downloading your screenshots/uploads.

########################
# Upload Configuration #
########################

# "s3", "cloudfiles", "cloudapp", "ssh", "ftp", "imageshack", "imgur"
# Note for SSH users: if you have keys configured, leave the "login" and "pass" values blank.
UPLOADMETHOD=""
# Only used for SSH and FTP
SERVER=""
# S3 API key, CloudFiles user, FTP user, or SSH user when using sshpass
LOGIN=""
# S3 API secret, CloudFiles API key, Imgur API key, FTP password, or SSH pass when using sshpass
PASS=""
# Container, S3 bucket, or server directory (not needed for Imageshack/Imgur)
CONTAINER=""

#####################
# URL Configuration #
#####################

# "long", "bitly", "isgd", "vgd", or "tinyurl"
# CloudApp users must configure this option in their account
URLMETHOD=""
# API user for URL service (currently only needed for bit.ly)
URL_APIU=""
# API key for URL service (again, currently only needed for bit.ly)
URL_APIK=""
# The base URL that your uploads are stored at (not needed for Imageshack/Imgur)
# THIS IS NEEDED EVEN IF YOU ARE USING A URL SHORTENER
BASEURL=""

#######################
# Misc. Configuration #
#######################

# true/false - Apply a dropshadow to screenshot? Does *not* apply to file uploads.
DROPSHADOW=""
# true/false - Delete the local screenshot after upload? Doesn't apply to file uploads.
DELETELOCAL=""
# true/false - Keep a local upload ledger?
USELOG=""

###################################################
###### START THE SCREENSHOT AND UPLOAD PROCESS ####
###################################################

SCRIPTDIR=`dirname "$0"`
DATE=`date +%Y%m%d%H%M%S`

if [ "$1" = "" ]; then
    ACTION="screenshot"
    FILE_FULL="$SCRIPTDIR/$DATE.png"
    scrot -s $FILE_FULL
    if [ "$DROPSHADOW" = "true" ]; then
        convert $FILE_FULL -gravity northwest -background 'rgba(255,255,255,0)' -splice 10x10 \
            \( +clone -background gray -shadow 80x12-1-1 \) +swap -background none -mosaic +repage \
            \( +clone -background gray -shadow 80x12+5+5 \) +swap -background none -mosaic +repage $FILE_FULL
    fi
else
    ACTION="upload"
    FILE_FULL="$1"
    if [ ! -f $FILE_FULL ]; then
        zenity --error --text "\"$FILE_FULL\" does not exist. Script cannot continue."
        exit
    fi
fi

FILE="`openssl < $FILE_FULL md5`.`echo $FILE_FULL | awk -F . '{print $NF}'`"
MIMETYPE="`file --mime-type $FILE_FULL | cut -d " " -f2`"
CLASSIFICATION=`echo $MIMETYPE | cut -d "/" -f1`

if [ "$UPLOADMETHOD" = "cloudapp" ]; then
    RESPONSE=`curl -G --digest -u $LOGIN:$PASS -H "Content-Type: application/json" -H "Accept: application/json" http://my.cl.ly/items/new | sed 's/$(unknown)//' | sed 's/\"params\"://' | sed 's/{//g' | sed 's/}//g' | sed 's/\":/\"=/' | sed 's/\":\"/\"=\"/g'`
    function gets3values {
        s3step=$1
        if [ "$s3step" = "" ]; then
            s3step=1
        fi
        item=`echo $RESPONSE | cut -d "," -f$s3step | sed 's/\"//g'`
        if [ "$item" != "" ]; then
            eval $item
            gets3values $(expr $s3step + 1)
        fi
    }
    gets3values
    S3RESPONSE=`curl -sS -D - -o /dev/null -F "key=$key$FILE" -F "acl=$acl" -F "Policy=$policy" -F "AWSAccessKeyId=$AWSAccessKeyId" -F "Signature=$signature" -F "success_action_redirect=$success_action_redirect" -F "file=@$FILE_FULL" "$url"`
    LOCATION=`echo -n $S3RESPONSE | grep -o http[^\<]* | cut -d " " -f1 | cut -d "&" -f1,2`
    CARESPONSE=`curl -G --digest -u $LOGIN:$PASS -H "Accept: application/json" "$LOCATION" | sed 's/:null/=\"null\"/g' | sed 's/:true/=\"true\"/g' | sed 's/{//g' | sed 's/}//g' | sed 's/\":/\"=/' | sed 's/\":\"/\"=\"/g'`
    function getcavalues {
        castep=$1
        if [ "$castep" = "" ]; then
            castep=1
        fi
        item=`echo $CARESPONSE | cut -d "," -f$castep | sed 's/\"//g'`
        if [ "$item" != "" ]; then
            eval $item
            getcavalues $(expr $castep + 1)
        fi
    }
    getcavalues
    LONGURL=$url
elif [ "$UPLOADMETHOD" = "s3" ]; then
    RFCDATE="`date -R`"
    POLFILE="$SCRIPTDIR/POLFILE$DATE.tmp"
    echo { >> $POLFILE
    echo \"expiration\": \"2015-06-15T12:00:00.000Z\", >> $POLFILE
    echo \"conditions\": [ >> $POLFILE
    echo {\"acl\": \"public-read\"}, >> $POLFILE
    echo {\"bucket\": \"$CONTAINER\" }, >> $POLFILE
    echo [\"eq\", \"\$key\", \"$FILE\"], >> $POLFILE
    echo [\"eq\", \"\$Content-Type\", \"$MIMETYPE\"], >> $POLFILE
    echo [\"eq\", \"\$x-amz-storage-class\", \"REDUCED_REDUNDANCY\"] >> $POLFILE
    echo ] >> $POLFILE
    echo } >> $POLFILE
    POLICY="`base64 -w0 $POLFILE`"
    SIGNATURE="`base64 -w0 $POLFILE | openssl dgst -sha1 -hmac $PASS -binary | openssl base64`"
    curl -F "key=$FILE" -F "acl=public-read" -F "Policy=$POLICY" -F "AWSAccessKeyId=$LOGIN" -F "Signature=$SIGNATURE" -F "Content-Type=$MIMETYPE" -F "x-amz-storage-class=REDUCED_REDUNDANCY" -F "file=@$FILE_FULL" http://$CONTAINER.s3.amazonaws.com
    LONGURL="$BASEURL/$FILE"
    rm $POLFILE
elif [ "$UPLOADMETHOD" = "cloudfiles" ]; then
    eval $(curl -s -X "GET" -D - \
        -H "X-Auth-Key:$PASS" \
        -H "X-Auth-User:$LOGIN" \
        https://api.mosso.com/auth | gawk '$1=="X-Storage-Token:" { sub(/\r/,"",$2);printf("TOKEN=\"%s\"\n",$2); } $1=="X-Storage-Url:" { sub(/\r/,"",$2);printf("SURL=\"%s\"\n",$2) } $1=="X-CDN-Management-Url:" { sub(/\r/,"",$2);printf("CDN=\"%s\"\n",$2) }')
    curl -s -X "PUT" -T "$FILE_FULL" \
        -H "X-Auth-Token: $TOKEN" \
        -H "Content-Type: $MIMETYPE" \
        $SURL/$CONTAINER/$FILE
    LONGURL="$BASEURL/$FILE"
elif [ "$UPLOADMETHOD" = "ssh" ]; then
    if [ "$LOGIN" = "" ] && [ "$PASS" = "" ]; then
        scp -P 22 $FILE_FULL $SERVER:$CONTAINER/$FILE
    else
        RESULT=`sshpass -p $PASS scp -P 22 $FILE_FULL $LOGIN@$SERVER:$CONTAINER/$FILE`
        echo $RESULT
    fi
    LONGURL="$BASEURL/$FILE"
elif [ "$UPLOADMETHOD" = "ftp" ]; then
    DIRNAME="`dirname $FILE_FULL`"
    BASENAME="`basename $FILE_FULL`"
    cd $DIRNAME
    ftp -n -v $SERVER << DONEWITHFTP
user $LOGIN $PASS
cd $CONTAINER
put $BASENAME $FILE
bye
DONEWITHFTP
    LONGURL="$BASEURL/$FILE"
elif [ "$UPLOADMETHOD" = "imageshack" ]; then
    if [ "$CLASSIFICATION" = "image" ]; then
        OLDFILE="$FILE_FULL"
        FILE_FULL="`dirname $OLDFILE`/$FILE"
        mv $OLDFILE $FILE_FULL
        LONGURL=`curl -H Expect: -F fileupload="@$FILE_FULL" -F xml=yes -# "http://www.imageshack.us/index.php" | grep image_link | grep -o http[^\<]*`
        mv $FILE_FULL $OLDFILE
        FILE_FULL="$OLDFILE"
    else
        zenity --error --text "Imageshack is an image hosting service. You are trying to upload a \"$CLASSIFICATION\" file. This simply cannot be done."
        exit
    fi
elif [ "$UPLOADMETHOD" = "imgur" ]; then
    if [ "$CLASSIFICATION" = "image" ]; then
        OLDFILE="$FILE_FULL"
        FILE_FULL="`dirname $OLDFILE`/$FILE"
        mv $OLDFILE $FILE_FULL
        LONGURL=`curl -s -F "image=@$FILE_FULL" -F "key=$PASS" http://imgur.com/api/upload.xml | grep -E -o "<original_image>(.)*</original_image>" | grep -E -o "http://i.imgur.com/[^<]*"`
        mv $FILE_FULL $OLDFILE
        FILE_FULL="$OLDFILE"
    else
        zenity --error --text "Imgur is an image hosting service. You are trying to upload a \"$CLASSIFICATION\" file. This simply cannot be done."
        exit
    fi
else
    zenity --error --text "This script is not configured to upload to \"$UPLOADMETHOD\". Please select a valid option and try again."
    exit
fi

if [ "$URLMETHOD" = "long" ]; then
    URL="$LONGURL"
elif [ "$URLMETHOD" = "bitly" ]; then
    URL=`curl "http://api.bit.ly/v3/shorten?login=$URL_APIU&apiKey=$URL_APIK&format=txt&longUrl=$LONGURL"`
elif [ "$URLMETHOD" = "isgd" ]; then
    URL=`curl "http://is.gd/api.php?longurl=$LONGURL"`
elif [ "$URLMETHOD" = "vgd" ]; then
    URL=`curl "http://v.gd/create.php?format=simple&url=$LONGURL"`
elif [ "$URLMETHOD" = "tinyurl" ]; then
    URL=`curl "http://tinyurl.com/api-create.php?url=$LONGURL"`
else
    zenity --error --text "A valid URL method was not specified so no URL was copied to your clipboard."
fi

echo -n "$URL" | xsel --clipboard --input

if [ "$DELETELOCAL" = "true" ] && [ "$ACTION" = "screenshot" ]; then
    rm $FILE_FULL
else
    DELETELOCAL="false"
fi

if [ "$USELOG" = "true" ]; then
    LOG="$SCRIPTDIR/log.csv"
    if [ ! -f $LOG ]; then
        echo "unix_time,friendly_time,action,local_file,mime_type,service,container,item,url,deleted" >> $LOG
    fi
    echo "$DATE,`date +%Y-%m-%d_%H-%M-%S`,$ACTION,$FILE_FULL,$MIMETYPE,$UPLOADMETHOD,$CONTAINER,$FILE,$URL,$DELETELOCAL" >> $LOG
fi
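The hash-based naming scheme the script relies on can be seen in action with a throwaway file. The filename and contents below are stand-ins; the `awk '{print $NF}'` step just grabs the digest regardless of whether openssl prefixes it with "(stdin)=":

```shell
#!/bin/bash
# Sketch of the script's naming scheme: MD5 of the file contents, plus the
# original extension. "example.png" and its contents are stand-ins.
FILE_FULL="example.png"
printf 'data' > "$FILE_FULL"
HASH=$(openssl md5 < "$FILE_FULL" | awk '{print $NF}')
EXT="${FILE_FULL##*.}"
FILE="$HASH.$EXT"
echo "$FILE"
rm "$FILE_FULL"
```

Because the digest comes from the contents rather than a timestamp, two runs over the same file yield the same name, but the name cannot be enumerated the way sequential or date-based names can.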
  5. If you are looking at using Amazon S3/CloudFront to host a static website, I have come up with a nice little script (requires s3cmd from http://s3tools.org/s3cmd) that allows you to create a bucket and distribution for your site in a relatively seamless fashion. What it does:

- Creates an S3 bucket (or uses an existing one)
- Uploads ALL of the files within a specified folder tree into the bucket
- Makes all of the items "public"
- Optionally creates a CloudFront distribution and sets a CNAME and default root object (e.g. index.html) based on your input

Below is the script. Obviously you're going to need s3cmd and an existing Amazon Web Services account with S3 and CloudFront enabled. The result will ultimately be a fast CDN at Amazon's very reasonable price. Be sure to save this as a bash script and set it to executable (chmod +x).

#!/bin/bash
clear
echo "S3/CloudFront Website Uploader"
echo
echo "This utility will take advantage of AWS to host a website."
echo "The end result will be a cost-effective CDN-based website."
echo
echo "Please note:"
echo " - You must have an existing AWS account w/ S3 + CF"
echo " - You must have s3cmd 1.0.0+ installed on this computer"
echo " - Your API info can be found in your AWS account settings"
echo " - All files will be public and filetypes will be guessed"
echo " - An S3 bucket will be created if it doesn't exist"
echo " - A CloudFront distribution can be created if specified"
echo
echo "--------------------------------------------------------------"
echo
echo "Please enter your API key:"
read API_KEY
echo
echo "Please enter your API secret:"
read API_SEC
echo
echo "Please enter the local folder that contains your site:"
read LOC_DIR
echo
echo "Please enter the S3 bucket where the site will be uploaded to:"
read REM_BUC
echo
echo "Set up a CloudFront distribution - 'true' or 'false'"
echo "Use 'true' if you want to take advantage of edge locations or default root objects."
echo "DO NOT use 'true' if you have an existing distribution on the given bucket."
read CFT_SET
if [ "$CFT_SET" = "true" ]; then
    echo
    echo "Please enter the default root object (index.html) for CloudFront:"
    read CFT_DRE
    echo
    echo "Please enter the CNAME that you will use for the CF distribution:"
    read CFT_CNM
fi
echo
echo "--------------------------------------------------------------"
echo
echo "Checking to make sure that all of the needed info was input..."
echo
if [ "$API_KEY" != "" ] && [ "$API_SEC" != "" ] && [ "$LOC_DIR" != "" ] && [ "$REM_BUC" != "" ]; then
    echo "All of your information appears to have been entered."
    echo "Mind you, the validity of this information has yet to be verified."
    echo
    echo "--------------------------------------------------------------"
    echo
else
    echo "YOU ARE MISSING FIELD DATA. TRY AGAIN."
    echo
    echo "--------------------------------------------------------------"
    echo
    exit
fi
echo "Making sure that your local directory is valid..."
echo
if [ -e $LOC_DIR ]; then
    echo "Local directory has been validated!"
    echo
    echo "--------------------------------------------------------------"
    echo
else
    echo "THE LOCAL DIRECTORY YOU SPECIFIED DOES NOT EXIST. TRY AGAIN."
    echo
    echo "--------------------------------------------------------------"
    echo
    exit
fi
echo "Making sure that you have s3cmd installed..."
echo
if [ "`whereis s3cmd`" != "s3cmd:" ]; then
    echo "s3cmd appears to be installed. All is good."
    echo
    echo "--------------------------------------------------------------"
    echo
else
    echo "S3CMD IS NOT INSTALLED. TRY AGAIN."
    echo
    echo "--------------------------------------------------------------"
    echo
    exit
fi
echo "Creating temporary s3cmd configuration file..."
echo
S3_FILE="`dirname $0`/`date +%Y%m%d%H%M%S`s3cmd.tmp"
echo "$S3_FILE"
echo "access_key = $API_KEY" >> $S3_FILE
echo "secret_key = $API_SEC" >> $S3_FILE
tail $S3_FILE
echo
echo "--------------------------------------------------------------"
echo
echo "STARTING THE UPLOAD PROCESS!"
echo
s3cmd --config=$S3_FILE mb s3://$REM_BUC
s3cmd --config=$S3_FILE sync --acl-public --guess-mime-type $LOC_DIR/ s3://$REM_BUC
if [ "$CFT_SET" = "true" ]; then
    s3cmd --config=$S3_FILE cfcreate s3://$REM_BUC
    s3cmd --config=$S3_FILE cfmodify --cf-default-root-object=$CFT_DRE s3://$REM_BUC
    CFT_DMN="`s3cmd --config=$S3_FILE cfmodify --cf-add-cname=$CFT_CNM s3://$REM_BUC | grep DomainName`"
fi
echo
echo "--------------------------------------------------------------"
echo
echo "The details for your new configuration are as follows:"
echo "S3 Bucket: $REM_BUC"
if [ "$CFT_SET" = "true" ]; then
    echo "$CFT_DMN"
    echo "DfltRtObjt: $CFT_DRE"
    echo "CNAME: $CFT_CNM"
    echo
    echo "You will want to create a DNS record pointing your CNAME to your CloudFront distribution ('DomainName')."
    echo
    echo "CloudFront takes a bit of time to set up."
    echo "Patience is a virtue."
fi
echo
echo "--------------------------------------------------------------"
rm $S3_FILE
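For the curious, here is a minimal sketch of the kind of throwaway credentials file the script hands to `--config` (the keys are fake, and the `[default]` section header is included because s3cmd configs are INI-style and some versions refuse a file without it):

```shell
#!/bin/bash
# Sketch of a throwaway s3cmd config file; EXAMPLEKEY/EXAMPLESECRET are fake
S3_FILE="$(mktemp)"
{
  echo "[default]"
  echo "access_key = EXAMPLEKEY"
  echo "secret_key = EXAMPLESECRET"
} >> "$S3_FILE"
CONFIG_BODY="$(cat "$S3_FILE")"
echo "$CONFIG_BODY"
rm "$S3_FILE"       # delete it as soon as you're done, just like the script does
```

Writing the keys to a temp file and deleting it afterward keeps them out of your shell history and out of your permanent s3cmd config.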
  6. I'm trying to use as few resources as possible on my WordPress installation. Is there anything I can do to disable post revisions? These seem to take up a bit of space and are not really necessary for what I need. Do you have any other tips for making WordPress a more efficient platform?
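One common approach, for what it's worth, is defining the WP_POST_REVISIONS constant in wp-config.php. A rough sketch, run here against a local stand-in copy of the file rather than a real install:

```shell
#!/bin/bash
# Sketch: disable WordPress post revisions by adding a define to wp-config.php.
# "wp-config.php" below is a local stand-in file, not a real WordPress config.
CONFIG="wp-config.php"
cat > "$CONFIG" <<'EOF'
<?php
/* That's all, stop editing! Happy blogging. */
EOF
# Insert the define above the "stop editing" marker if it isn't already present
if ! grep -q "WP_POST_REVISIONS" "$CONFIG"; then
  sed -i "/stop editing/i define('WP_POST_REVISIONS', false);" "$CONFIG"
fi
cat "$CONFIG"
```

Setting the constant to a small integer instead of false keeps that many revisions per post, which is a middle ground if you still want a safety net.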
  7. Depending on your network card, you will likely have to install the drivers. In Ubuntu, this is often a trivial task. First off, you will need to plug the device into a hard-wired network connection (ethernet) so that it has access to the internet. Then you will want to do two things. First, ensure that you have the non-free repositories configured. The easiest way to do that is to head to the terminal ("Applications" -> "Accessories" -> "Terminal") and type in "sudo gedit /etc/apt/sources.list". You will need to enter your password, and from there a text editor window will open with a list of repositories. You will likely want to replace the contents of the file as follows (assuming you're on 10.10):

deb http://us.archive.ubuntu.com/ubuntu/ maverick main restricted
deb-src http://us.archive.ubuntu.com/ubuntu/ maverick main restricted
deb http://us.archive.ubuntu.com/ubuntu/ maverick-updates main restricted
deb-src http://us.archive.ubuntu.com/ubuntu/ maverick-updates main restricted
deb http://us.archive.ubuntu.com/ubuntu/ maverick universe
deb-src http://us.archive.ubuntu.com/ubuntu/ maverick universe
deb http://us.archive.ubuntu.com/ubuntu/ maverick-updates universe
deb-src http://us.archive.ubuntu.com/ubuntu/ maverick-updates universe
deb http://us.archive.ubuntu.com/ubuntu/ maverick multiverse
deb-src http://us.archive.ubuntu.com/ubuntu/ maverick multiverse
deb http://us.archive.ubuntu.com/ubuntu/ maverick-updates multiverse
deb-src http://us.archive.ubuntu.com/ubuntu/ maverick-updates multiverse
deb http://archive.canonical.com/ubuntu maverick partner
deb-src http://archive.canonical.com/ubuntu maverick partner
deb http://extras.ubuntu.com/ubuntu maverick main
deb-src http://extras.ubuntu.com/ubuntu maverick main
deb http://security.ubuntu.com/ubuntu maverick-security main restricted
deb-src http://security.ubuntu.com/ubuntu maverick-security main restricted
deb http://security.ubuntu.com/ubuntu maverick-security universe
deb-src http://security.ubuntu.com/ubuntu maverick-security universe
deb http://security.ubuntu.com/ubuntu maverick-security multiverse
deb-src http://security.ubuntu.com/ubuntu maverick-security multiverse

Go ahead and save this file, then run the command "sudo apt-get update". Next, you will simply want to tell Ubuntu to search for available hardware drivers. To do so, navigate to the "System" menu, then select "Administration". From there, select the "Additional Drivers" button. A window should load, and after a minute or so you will likely see your network card as an available driver. Go ahead and activate it, let Ubuntu do its thing, and restart the computer. I've attached an image showing what the "Additional Drivers" window should look like. If you have any difficulties, be sure to post back. Better yet, if you cannot get the network card to detect, post its model number. We may be able to further assist you with this information.
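Before running the update, a quick grep can confirm the extra components actually made it into the file. In this sketch, "sources.list.sample" is a local stand-in for /etc/apt/sources.list:

```shell
#!/bin/bash
# Sanity-check that universe/multiverse components are present in a sources list.
# "sources.list.sample" is a stand-in for /etc/apt/sources.list.
SOURCES="sources.list.sample"
cat > "$SOURCES" <<'EOF'
deb http://us.archive.ubuntu.com/ubuntu/ maverick main restricted
deb http://us.archive.ubuntu.com/ubuntu/ maverick universe
deb http://us.archive.ubuntu.com/ubuntu/ maverick multiverse
EOF
ENABLED=""
for comp in universe multiverse; do
  if grep -q " $comp$" "$SOURCES"; then
    ENABLED="$ENABLED $comp"
  fi
done
echo "enabled:$ENABLED"
rm "$SOURCES"
```

If either component is missing from the output, the edit didn't take and "sudo apt-get update" won't pick up the non-free driver packages.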
  8. I leave my computer on for long periods of time, even when I'm not using it. I'd like to know if there is a way to tell the computer to defragment itself every so often, so that I don't have to worry about doing so when I actually need to use the computer. How can I do this?
  9. Is VirtualPC really the best out there? Or would you recommend looking at something else?
  10. What/where are the best Cyber Monday deals? Any online retailers that you've had particularly good experience with in the past?
  11. So. The question is in the topic. Joomla? Drupal? WordPress? Which one should I use for a website?
  12. Go here. http://www.google.com/support/forum/p/Chrome/thread?tid=36ee1bedc8acf004&hl=en Thanks
  13. Anyone know how to set up Windows to allow for hibernation?
  14. Mandy, this post was hilarious and awesome. But that makes sense, because an awesome person like Jeff deserves an awesome birthday post.