Mar 06, 2015

How to “Un-Mount” (umount) a Remote SSHFS Drive on Mac OS 10.9

Posted by admin In Web Design

In my post SSH functionality with RackSpace Cloud Sites via SSHFS, we explore mounting a remote server using sshfs.

It is common for the connection to fall asleep, leaving you unable to un-mount the drive and, subsequently, unable to re-mount it.

After much searching I found the following solution for Mac OS 10.9.

Try the following:

umount -f <absolute pathname to the mount point>

If that does not work then try:

sudo umount -f /Users//Documents/mounted_dir 

NOTE: The command is umount NOT unmount. I killed about an hour before I figured THAT out.
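
If you are not sure of the exact mount point, the mount command will list everything currently mounted. A quick sketch (the grep pattern is a guess; the exact filesystem name depends on your OSXFUSE/sshfs version):

mount | grep -i fuse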

If you still have trouble, you can kill the process and try again.

1) Find the culprit sshfs process:

$ pgrep -lf sshfs

2) Kill it:

$ kill -9 <pid_of_sshfs_process>

3) sudo force unmount the “unavailable” directory:

$ sudo umount -f <mounted_dir>

For example:

sudo umount -f /Users//Documents/mounted_dir 

Mar 06, 2015

SSH functionality with RackSpace Cloud Sites via SSHFS

Posted by admin In Web Design

Working with RackSpace Cloud Sites has its advantages. One drawback, though, is the lack of SSH access to your sites. This makes the routine advantages of SSH inaccessible. There is a work-around, though: enter sshfs. SSHFS is a way to mount a file system over SSH. The client interacts with the remote file system via the SSH File Transfer Protocol (SFTP). Once mounted, you can use sshfs for the following:

tar     gzip     gunzip     zip     unzip     chmod

If you are having trouble un-mounting your remote connection, read this: How to “Un-Mount” (umount) a Remote SSHFS Drive on Mac OS 10.9

Here is how SSHFS works

Make a directory on your local computer that will be the “window” into the remote directory.

mkdir remote_home

Type this sshfs command from the terminal (command line).

sshfs me@www.myhost.com:/home/me/ remote_home

Take a look at the window directory you created.

ls -l remote_home/
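
Once mounted, the commands listed earlier work directly on the remote files through the local “window” directory. Here is a minimal sketch (the file and folder names are hypothetical):

tar -zcvf backup.tar.gz -C remote_home some_remote_folder
chmod 644 remote_home/some_remote_file.html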

Here is how to unmount the remote directory:

umount remote_home

How to install sshfs on a Mac with homebrew

This article deals with installing sshfs on a Mac OS 10.X. If you wish to install sshfs on a different system I recommend you check out RackSpace’s link here.

Mac Installation Notes

You can use homebrew to install SSHFS.

brew doctor
brew update 
brew install sshfs 

How to fix the “the OSXFUSE file system is not available” error

You may get the error “the OSXFUSE file system is not available”. If you do, the existing OSXFUSE file system is not playing nice with SSHFS. In that case, try this:

brew info sshfs 

This sometimes yields additional information. In my case it told me to install a new version of the FUSE kernel and provided the syntax for the terminal commands.

# remove kernel extension:
sudo kextunload -b com.github.osxfuse.filesystems.osxfusefs
 
# Use new kernel extension:
sudo /bin/cp -RfX \
/usr/local/Cellar/osxfuse/2.6.2/Library/Filesystems/osxfusefs.fs /Library/Filesystems
sudo chmod +s \
  /Library/Filesystems/osxfusefs.fs/Support/load_osxfusefs 

After this, sshfs worked great.
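
After your next successful mount, you can double-check which FUSE kernel extension is actually loaded (a quick sanity check; the bundle name can vary between osxfuse versions):

kextstat | grep -i fuse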

Mar 06, 2015

How To Compress a Directory with the tar Command and Exclude Certain Files

When creating a tar-ball (compressing a directory of files), you may exclude certain files by using the --exclude flag.

The --exclude flag will also work with a wildcard “*”. Here is an example of how to tar-ball a directory from the parent directory and exclude certain files:

tar --exclude="node_modules"  --exclude=".*" -zcvf my-tar-ball.tar.gz ./<directory to compress>

IMPORTANT: Be sure to use the --exclude just after the tar command, NOT at the end of the command.
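
To confirm the exclusions took effect, you can list the archive contents without extracting it (using the archive name from the example above) and check that node_modules and the dot-files are absent:

tar -tzf my-tar-ball.tar.gz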

Mar 05, 2015

FIXED: Grunt Dploy Not Pushing Files

Posted by admin In Errors & Fixes, Grunt, How To

I was trying to set up Grunt to deploy to my development server.

According to the output in the terminal, all went well. However, no files were being uploaded.

$>grunt
Running "dploy:stage" (dploy) task
Connecting to ftp.my-dev-server.com...
Checking revisions...
Uploading files...
[ + ] File uploaded: /Users/MyUser/Sites/my-site-folder/dev/.rev:
Upload completed for ftp.my-dev-server.com
All Completed :)  

Take note of where it says:
Uploading files...
[ + ] File uploaded: /Users/MyUser/Sites/my-site-folder/dev/.rev:

When setting up the info in my grunt.js file, I had used the full path as output by the “pwd” command like this:

   dploy: {
        stage: {
            host: "ftp3.ftptoyoursite.com",
            user: "user-name",
            pass: "password here",
            scheme: "sftp",
            port: "22",
            path: {
                local: "/Users/MyUser/Sites/my-site-folder/dev/", // <-- full path is WRONG
                remote: "/dev.okonlabs.com/web/content"
            }
        }
    }

According to the feedback in the terminal (shown above), it "appears" to have worked just fine. However, this is WRONG.

Actually, the grunt.js file needs the stage: path: local: value to be the path as seen from the project root; that is, the same place where the grunt.js file resides.

   dploy: {
        stage: {
            host: "ftp3.ftptoyoursite.com",
            user: "user-name",
            pass: "password here",
            scheme: "sftp",
            port: "22",
            path: {
                local: "/", // <-- IMPORTANT!
                remote: "/dev.okonlabs.com/web/content"
            }
        }
    }

I hope that will save you a few hours of scratching your head. : )

Mar 05, 2015

How To Resolve the Dploy “Bad Object” Error

Posted by admin In dploy, Grunt

dploy is a really useful way to deploy websites using FTP/SFTP and git from the command line.

Error: Command failed: /bin/sh -c git diff … fatal: bad object

While setting up your local working environment to use dploy, you may encounter a stubborn error about a supposed bad object when dploy attempts to perform a git diff.

Connecting to ftp.my-development-server.com...
Checking revisions...
Checking diffs between [116fc716b06c4120bd0fda0e03331e72b481d432] > [216fc716b06c4120bd0fda0e03331e72b481d432]
An error occurred when retrieving the 'git diff --name-status 116fc716b06c4120bd0fda0e03331e72b481d432 216fc716b06c4120bd0fda0e03331e72b481d432' { [Error: Command failed: /bin/sh -c git diff --name-status 116fc716b06c4120bd0fda0e03331e72b481d432 216fc716b06c4120bd0fda0e03331e72b481d432
fatal: bad object 116fc716b06c4120bd0fda0e03331e72b481d432
]
  killed: false,
  code: 128,
  signal: null,
  cmd: '/bin/sh -c git diff --name-status 116fc716b06c4120bd0fda0e03331e72b481d432 216fc716b06c4120bd0fda0e03331e72b481d432' }

This might happen if you attempted to set it up, failed for some reason, and then started over.
Whatever the reason, do not jump to the conclusion that there is actually a bad object in the repo. Often, when git encounters an object it does not expect, it throws a bad object error.

How to tell

First, check to see if there is a residual .rev file on the server dploy is attempting to “push” to. There is a good chance of this being the case if you have made more than one attempt to install dploy.

The way to check is simple: open the .rev file that is sitting on the remote server and see if it contains the hash of the “bad object”.

Take note of the bad object

In this case the error says:
fatal: bad object 116fc716b06c4120bd0fda0e03331e72b481d432

Open the remote .rev file

NOTE: Since this is a file name that starts with a dot “.”, it is a “hidden” file. You will need to make sure your FTP browser is allowing you to see hidden files.
You can find the .rev file in whatever directory you are attempting to “push” to on the remote server.

Take note of the object hash in the remote .rev file

It will not be hard to find as the .rev file only contains a single object hash.
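
If you prefer the command line to an FTP browser, you can also pull the file down over sftp. A rough sketch (substitute your own host, user, and remote deploy directory):

sftp user-name@ftp.my-development-server.com
sftp> get /path/to/your/remote/deploy/directory/.rev
sftp> exit
cat .rev    # compare this hash with the "bad object" hash from the error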

How to resolve the issue

If the hash is the same as the one in the error message (i.e. 116fc716b06c4120bd0fda0e03331e72b481d432), then delete the remote .rev file. Dploy will create another one.

I hope this was helpful to you. If so, please comment or share.

Mar 04, 2015

How To Test Apache Configuration File httpd.conf

Posted by admin In Apache, How To, MAMP

If you have been editing the Apache httpd.conf file, you can get valuable feedback by using the following command:

apachectl configtest

This will return any errors or else report that the syntax is OK.
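
For example, on a healthy configuration the output is simply a confirmation (the exact wording can vary slightly between Apache versions); on a broken one it points to the offending file and line number:

$> apachectl configtest
Syntax OK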

Mar 02, 2015

Git Workflow Explained

Posted by admin In Git

Just a quick post to recommend this good, straightforward read on a successful Git workflow by Vincent Driessen.

Here is an excerpt:


Next to the main branches master and develop, our development model uses a variety of supporting branches to aid parallel development between team members, ease tracking of features, prepare for production releases and to assist in quickly fixing live production problems. Unlike the main branches, these branches always have a limited life time, since they will be removed eventually.

Vincent even includes a PDF download of a git workflow.

Mar 02, 2015

How To Fix: “Errno::ENOENT: No such file or directory @ rb_sysopen – undefined” Grunt Error

Posted by admin In Errors & Fixes, Grunt, How To, node

While setting up a grunt.js file, I ran into the following stubborn error.

Running "sass:dist" (sass) task
Errno::ENOENT: No such file or directory @ rb_sysopen - undefined
  Use --trace for backtrace.

After having Googled several solutions to the problem, I ended up updating my Ruby and uninstalling and re-installing nodejs, among other things.

It turns out I had a misspelling in the directory path of a key-value pair inside the grunt.js file.

I missed it because it was on a line that was off the screen.

This error can at times appear to be a more serious issue than it really is.

So remember to check your grunt.js file for proper syntax first. You just might save yourself some trouble.

...
 // Project configuration.
  grunt.initConfig({
    pkg: grunt.file.readJSON('package.json'),
    sass: {                              
       dist: {                            
         options: {                      
           style: 'expanded'
         },
         files: {
           // destination CSS file : source SCSS file
           'path/to/the/css/content/style.css': 'source/path/to/the/other/--->misspelling-here<---/scss/content/style.scss'
         }
        }
      },
...
  });
...

Feb 27, 2015

How To Make Directories Recursively from the Command Line with mkdir -p

On occasion, like when working with ZendFramework, you may want to build a directory tree.
Did you know you can build a recursive directory tree?
Here’s how:

mkdir normally works like this:

prompt$> mkdir directory-name

directory-name

To build a directory and another directory inside of it you can do the following:

prompt$> mkdir -p parent-directory/child-directory

parent-directory/
    child-directory

You can even make a grandchild directory.

prompt$> mkdir -p parent-directory/child-directory/grandchild-directory

parent-directory/
    child-directory/
         grandchild-directory

Use the tree command to see the directory tree.
If you’re on a Mac, you can install the tree command with brew.

prompt$> brew install tree

Then try this:

prompt$> tree parent*

parent-directory
└── child-directory
    └── grandchild-directory

Feb 27, 2015

How To Fix 403 Forbidden Error from htaccess File

As a web developer, you will likely need to make rewrites for clean URLs and produce an .htaccess file so you can create mod_rewrite rules.

You may encounter a 403 Forbidden error after creating the .htaccess file. This happens because, even though the .htaccess file itself may have the right permissions, it is possible the web server is not explicitly allowing the rewrites for that directory.

To check for this, you should take a look at the web server error log. The error log may be in different locations depending on your operating system: on Mac OS X it’s in /var/log/apache2/error_log; on most Linux boxes it’s in /var/log/httpd/error_log.

For example, you can view the last few errors in the error log by using the tail command with its follow flag (-f) on the Linux command line, like this:

prompt$> tail -f /var/log/apache2/error_log

If you see an error like this:

[Fri Feb 27 09:20:28 2015] [error] [client 127.0.0.1] Options FollowSymLinks or SymLinksIfOwnerMatch is off which implies that RewriteRule directive is forbidden: /Users/Leasure/Sites/zfSkeleton/public/

Then you can fix the issue by adding the following line to the top of the .htaccess file:

Options +FollowSymLinks

So it may look something like this (the other code should be specific to your own setup). The important thing to know is that by adding Options +FollowSymLinks you are explicitly allowing the rewrites.

Options +FollowSymLinks
RewriteEngine On
# The following rule tells Apache that if the requested filename
# exists, simply serve it.
RewriteCond %{REQUEST_FILENAME} -s [OR]
RewriteCond %{REQUEST_FILENAME} -l [OR]
RewriteCond %{REQUEST_FILENAME} -d
RewriteRule ^.*$ - [NC,L]
# The following rewrites all other queries to index.php. The
# condition ensures that if you are using Apache aliases to do
# mass virtual hosting, the base path will be prepended to
# allow proper resolution of the index.php file; it will work
# in non-aliased environments as well, providing a safe, one-size
# fits all solution.
RewriteCond %{REQUEST_URI}::$1 ^(/.+)(.+)::\2$
RewriteRule ^(.*) - [E=BASE:%1]
RewriteRule ^(.*)$ %{ENV:BASE}index.php [NC,L] 

If you are comfortable editing your Apache httpd.conf file, you can add the Options +FollowSymLinks to the directory directive like this:

<VirtualHost *:80>
ServerName skeleton.loc
DocumentRoot "/Users/Leasure/Sites/zfSkeleton/public"
   <Directory "/Users/Leasure/Sites/zfSkeleton/public">
       Options FollowSymLinks
       AllowOverride All
       Order allow,deny
       Allow from all
   </Directory>
</VirtualHost>
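
One last note: unlike .htaccess changes, edits to httpd.conf only take effect after Apache is restarted. A quick sequence (assuming apachectl is available, as in the configtest post above):

sudo apachectl configtest
sudo apachectl restart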