Thursday 20 November 2014

Sports Retail Management - client branded clothing website

We recently developed a new website for Sports Retail Management, a garment supply company which brands up clothes for all sorts of clients. This was a fun project (note the lack of "air quotes") - it was quite a challenge, but one thing we passed on to them was a lesson learned from days of yore...

This website was a brand new one for us. Working closely with the client, we created a combination site which has a public element coupled with a fully functional client extranet. Sports Retail Management came to us with a simple remit: allow their local authority clients to log onto the website under their own accounts and create garment orders. "Is that it?" Well, kind of... It then transpired that some clients have discounts afforded to them based on their level of purchasing... OK, that's not a problem. Oh, and each garment can have several colours and sizes. Yeah, that's fine. Except that some sizes have different pricing models.

OK, so this project started to look a little more complicated than initially envisaged, so we looked at how the system would work in the hands of the user. We started with a skeleton of the functional elements and imagined (note that word) what features might be useful for the Authority users.

Side note: we pride ourselves on (mostly) listening to our clients regarding what they want, and on adding value to the systems we develop to make them as intuitive and user-friendly as we can. One of the things we have learned is: don't overthink something - brainstorming is useful, but only come to conclusions if you are sure the end user will see and use the value of your efforts. A number of times I've seen projects (which we have inherited) with so many bells and whistles that you can't get to grips with the underlying functional purpose of the project. A developer who can see the actual value (rather than the perceived value) and then turn this into functionality is worth their weight in gold, every time.

So, back to the story: we sat down with the end user and asked them what they would use. This was an eye opener - it's all too common to hear our client ask for this and that without really knowing what their own client would see as the priority feature.

In this case there was a large degree of overlap, with one or two things which the end user particularly wanted that had not been mentioned at all during the development specification phase.

So, a lesson learned - not in an accusing manner, it's just that sometimes you would be best advised to speak with the end user, because the client, like you, can only imagine what the best features might be for a site which is heavily used by their clients.

There endeth the lesson...

Saturday 20 September 2014

Doing the backup shuffle - removing backups with PHP and CRONTAB jobs

If you, like me, take backups of MySQL or SQL Server databases (I guess we all probably should!) then you can do daily dumps of your .sql or .bak files and put them somewhere you can access them in the (hopefully unlikely) event of a disaster occurring with your database or server.

Now, I run a script (like the one I explained in an earlier post for Scheduled SQL Server Backups) which, like most good backup ideas, takes them away from the actual server - it never ceases to amaze me how some people consider a backup to the same server as the database a valid way of performing incremental backups. Anyway... so I have the SQL files backed up offsite, fine.

However, the database files I have backed up are becoming quite... chunky, one bumping the GB mark and a few others climbing in size as content and complexity are added. This can cause a problem on the server I save the backups to, as I only pay for a limited amount of disk storage, and to be honest I (and my clients) will be unlikely to want to retrieve or restore a database which is older than a week's worth of daily backups. So I needed a way of easily trimming the list and keeping the number of backups at a manageable size.

Getting the plan together


So looking at the specification of what I wanted to do:

  1. list the files I want to examine
  2. identify those older than 5 days
  3. delete them
  4. send me an email to let me know the task ran ok
  5. commit the job to Crontab

OK, so first things first. The PHP we need will list the files, look at the modified date and remove (unlink, in PHP terms) anything older than the cut-off from the folder. Easy enough:

// set the cut-off date: anything modified before this gets deleted
$cutoffdate = date("Y/m/d", strtotime('-5 days'));

// list all the zip archives in the backup folder
$list = glob('*.zip');

// sort the files by modified time, oldest first
usort($list, function($a, $b) { return filemtime($a) - filemtime($b); });

foreach ($list as $file) {
    $filemoddate = date('Y/m/d', filemtime($file));
    deloldfile($file, $filemoddate, $cutoffdate);
}

sendconfirmation("me", "my_email_address");



Breaking it down

$cutoffdate = date("Y/m/d", strtotime('-5 days'));

sets a variable holding the cut-off date we want to delete back to.

$list = glob('*.zip');

the PHP glob() function lists the files matching a pattern - here, all the .zip files in the current folder.
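
One thing to watch: glob('*.zip') is relative to the script's working directory, so if the script doesn't live in the backup folder itself you'll want a full path in the pattern. Something like this (the folder name here is just an assumption for illustration):

// a sketch: if the script doesn't run from the backup folder itself,
// give glob() the full path (this folder name is an assumption)
$list = glob('/var/backups/sql/*.zip');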

usort($list, function($a, $b) { return filemtime($a) - filemtime($b); });

I used usort in the script because I initially echoed the list back to the screen, and it was a handy way of listing the items in ascending date order according to the modified-time function filemtime(). After sorting the array of files, we iterate through the list with the foreach() loop and run an internal function to delete each file that is too old:

deloldfile($file, $filemoddate, $cutoffdate);

If we look at the function deloldfile() in all its glory, we can see that it does a quick check of the modified date versus the cut-off date, and if the file is older, it's deleted:

function deloldfile($file, $mdate, $cutoff) {
    // both dates are "Y/m/d" strings, so a plain string
    // comparison orders them chronologically
    if ($cutoff > $mdate) {
        unlink($file);
        #echo "marked for deletion";
    } else {
        #echo ":)";
    }
}
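
As an aside, if you'd rather not rely on comparing formatted date strings, a variant (just a sketch, not what the live script uses) is to compare the raw Unix timestamps directly:

function deloldfile_ts($file, $maxagedays = 5) {
    // filemtime() and strtotime() both return Unix timestamps,
    // so no date formatting is needed at all
    if (filemtime($file) < strtotime("-{$maxagedays} days")) {
        unlink($file);
    }
}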


Once that's done and the loop is complete, we just ping an email to me to tell me that it ran OK:

sendconfirmation()

The function which emails me is a simple little SMTP email script; sending through an authenticated email account makes it more likely the message will actually be delivered:

function sendconfirmation($name, $address) {
    $email   = 'from_email_address';
    $name    = 'backup task';
    $subject = 'SQL backups deleted successfully';
    $content = '<html><body style="font-family:arial, sans-serif; font-size:12px">The SQL backup delete task ran successfully on: ' . date("D dS M,Y h:i a", time()) . '</body></html>';

    require_once('class.phpmailer.php');
    $mail = new PHPMailer();
    $mail->IsSMTP();
    $mail->Host      = "smtp_mail_server_name";
    $mail->SMTPDebug = 0;   // set to 1 for verbose SMTP output while testing
    $mail->SMTPAuth  = true;
    $mail->Port      = 25;
    $mail->Username  = "smtp_email_address_account_name";
    $mail->Password  = "account_password";
    $mail->SetFrom($email, $name);
    $mail->AddReplyTo($email, $name);
    $mail->Subject = $subject;
    $mail->MsgHTML($content);
    $mail->AddAddress($address, $name);
    if (!$mail->Send()) {
        echo "Mailer Error: " . $mail->ErrorInfo;
    }
}


This uses the excellent SMTP PHPMailer library and I've never had any problems with this script.

Putting it all together, we have a working script which runs perfectly. So how can we automate it? We'll create the Linux version of a Scheduled Task: a cron job. As my servers run Plesk on CentOS, the UI is presented quite nicely, and so long as you set the task up correctly you should have no problems.

Creating the Cron Job


In Plesk the cron jobs are set up as "scheduled tasks" under Tools and Settings (Plesk 10 & 11).

Click on Tools and Settings, then Scheduled Tasks, and set the task up as the root user (I've found problems can occur when running system commands as other users).

Click on Add a Scheduled Task and you'll be greeted by something quite daunting: a completely unhelpful screen of empty fields.


A personal word here: I've come to the conclusion that Parallels and the Plesk vendors don't want any old wannabe accessing system things like tasks, so they seldom offer any advice for setting them up: if you mess it up, they have no wish to be held to account.

So setting up a cron job needs a little further explanation:

* means ALL THE TIME

that's the main explanation over. Seriously though, if you place * in all the time fields the task will run EVERY minute until doomsday (or your server dies, whichever is sooner), so you need to give serious consideration to how frequently you want the task to execute.
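
For reference, the five time fields map onto a raw crontab entry like this (a sketch of the standard crontab layout rather than anything Plesk-specific):

# minute  hour  day-of-month  month  day-of-week    command
# (0-59) (0-23)    (1-31)    (1-12) (0-7, Sun=0/7)
    1     11         *          *         6         /path/to/command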

For my situation I wanted to run it once a week, around 11.00am on a Saturday, so the fields are exactly as in the sketch above: minute 1, hour 11, day of month *, month *, day of week 6.

This means that at 1 minute past 11.00 every Saturday my command will be executed.

So the command (my URL is blurred out) is:

/usr/bin/wget -O - -q http://URL_TO_RUN_FROM/deletebackups.php > /dev/null 2>&1

We want to run a command, so we'll fire the WGET command-line utility at a fully declared web page. (A word to the wise: I've attempted to set up cron jobs which fire a URL-type command in various ways, and have only ever got this to work well with WGET at the command line.)

/usr/bin/wget - because on the server these command-line utilities tend to reside in the /usr/bin folder.

There are a number of switches available to append to the WGET command; we are using:

-O - -q

-O - : tells WGET to write the downloaded document to standard output (that's the trailing dash) rather than saving it to a file; the output is then thrown away at the end of the line

-q : runs WGET in quiet mode; we don't really care what the response is, as success is reported by the email alert at the end of the script

http://URL_TO_RUN_FROM/deletebackups.php - kinda obvious, but a fully qualified domain name and the script to execute

..and finally, your friend:

 > /dev/null 2>&1

If you leave this out and your email address is the root address for your server, you will get an email each time this script runs. Not a problem if you are running it weekly, but some of my scripts run every 15 minutes, and I personally can't be bothered with getting an email every time a scheduled task runs!

Adding > /dev/null 2>&1 redirects both standard output and standard error into the void, so cron has nothing to mail to the job's owner.
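
As an aside, if you're editing the crontab directly rather than through the Plesk UI, you can also silence the emails at source - as far as I'm aware, a blank MAILTO line at the top of the crontab stops cron mailing output for every job below it:

# stop cron emailing output for the jobs below
MAILTO=""
1 11 * * 6 /usr/bin/wget -O - -q http://URL_TO_RUN_FROM/deletebackups.php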

Save that job and let it run!

Wednesday 3 September 2014

Chunky iPad cover for little handies...

A client came to us last year with her idea for a foam cover for her iPad, having gone through a few devices thanks to her kids dropping, dinging or otherwise losing their grasp on them. After some extensive research into the market, it was clear there was a real need for a decent alternative to the cheap and (bizarrely) fragile options already out there.

Enter "fatframe" - an easy to insert iPad children's cover which when she showed me it in action had me initially horrified and then impressed, horrified as she proceeded to drop it down her stairs ("it's an iPad!!") and then really very impressed when it was fine and none the worse off for it's brief journey.

Fatframe really does seem to have found itself a niche. As we are all acutely aware, iPads are not cheap, and whilst the case itself won't protect against the face being dinged against a table corner, an inadvertent tumble onto the garden path, down the stairs or onto a tiled kitchen floor (ahem, personal experience) will give the device a much better chance of being saved from a pricey screen replacement or, worst case scenario, a brand new device! Less than 10% of the cost of a new iPad - a no-brainer as far as I'm concerned!

Pop over to the fatframe website and see what they offer - there's a stand for tea-time viewing, plus a forthcoming seat attachment and waterproof cover (don't get me started on that one - iPhone/stand-up toilet visit/not enough hands, let's leave it there...)

Wednesday 9 July 2014

jQuery, the beautiful and the ugly

Sometimes, you can have too much of a good thing. I coded up an image gallery which had a whopping amount of jQuery in it, handling image uploads, adding them to a database, naming them, assigning them to web pages, enabling them as backgrounds.

Even with the "live" or "on" functionality of jQuery, I was finding it tough to get a large number of images to pick up the click events and transfer those into internal status changes within the array of images being looped from the system folders as well as the database. It all got a little too much to handle.

I was faced with a dilemma: do I unpick all the jQuery code I had assembled to rationalise it (it WAS doing what I wanted to do, it just wasn't updating the internal flags), or try something else?

In the end, it was clear that if I reloaded the page following the click event, the internal status flag was picked up and the change was visible. So instead of an asynchronous call to the status handler, why didn't I just reload the page?

Well, dear reader, that's what I did - I'm not proud but by heavens it saved me another 4 hours of coding time!

An example is here (snippet):


  $(".removefromgallery").live("click",function(){
     var imageref = $(this).attr("rel");
     var imagefile = $(this).attr("id");
     //alert('id = '+ imageref +', image name = '+imgname+', image file = '+imagefile); return false;
     {
    
       //return false;
       var dataString = 'img='+imagefile;
     //alert (dataString);return false;
    
         $.ajax({
         type: "GET",
         url: "remove.php",
         data: dataString,
         success: function(response) {
   location.reload(true);
        });
      }
  });


The location.reload(true) function simply reloads the current URL in its entirety (complete with querystring values).
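
For completeness, remove.php itself isn't shown here; a minimal sketch of what such an endpoint might look like (the database credentials, table and column names are my own assumptions) would be:

<?php
// a sketch of remove.php: delete the image file and its database row
$db = new PDO('mysql:host=localhost;dbname=gallery', 'db_user', 'db_password');

$img = basename($_GET['img']); // basename() guards against path traversal

// remove the physical file from the gallery folder
$path = 'gallery/' . $img;
if (is_file($path)) {
    unlink($path);
}

// remove the matching database record
$stmt = $db->prepare('DELETE FROM images WHERE filename = ?');
$stmt->execute(array($img));

echo 'ok';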

Sometimes, simple does cut it!

Friday 23 May 2014

Loading Remote Content into Bootstrap Modal

Another one which puzzled me for a short while, after stumbling and bumbling my way through the awesome Bootstrap framework (which provides so many useful front-end tools for building a user interface), was loading remote content into a Bootstrap modal.

On the surface it looked very easy: declare a target in the markup and off you go (see http://getbootstrap.com/javascript/#modals for a summary):

<button class="btn btn-primary" data-toggle="modal" data-target="#modaltoopen">open</button>

This will style up a simple link as a medium-sized blue button which, when clicked, will open a modal window, provided the following code is included in the body:

<div id="modaltoopen" class="modal fade" tabindex="-1" role="dialog" aria-labelledby="myLargeModalLabel" aria-hidden="true">
  <div class="modal-dialog modal-lg">
    <div class="modal-content">
      ...
    </div>
  </div>
</div>

<!-- Small modal -->
<button class="btn btn-primary" data-toggle="modal" data-target=".bs-example-modal-sm">Small modal</button>

<div class="modal fade bs-example-modal-sm" tabindex="-1" role="dialog" aria-labelledby="mySmallModalLabel" aria-hidden="true">
  <div class="modal-dialog modal-sm">
    <div class="modal-content">
      ...
    </div>
  </div>
</div>


OK, so far so good (assuming you've also included the bootstrap.js and CSS files in the same page).

But what if you want to pull in a bunch of data, say from a loop over a series of records from a database? In our example we have a bug tracking system which lists the reported bugs in a table, each row with a small blue "view info" button, so that when the button is clicked we pull a template page into a modal dialog to let the user see the content of the reported bug.

Remote Content

Supposedly, pulling in a page is an easy process, and to be honest it is, but with a few caveats. If you want to access a URL you just apply an href to the link, and this will in turn grab the page and place it into the modal pop-up.

Something like this:

<a data-toggle="modal" href="loadbug.php?id=<?php echo $ibugid; ?>" data-target="#modaltoopen" class="btn btn-xs btn-info">view info</a>
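
In context, that link sits inside the loop over the bug records - a minimal sketch, assuming a $bugs array fetched from the database earlier (the column names here are my own):

<?php foreach ($bugs as $bug): ?>
  <tr>
    <td><?php echo htmlspecialchars($bug['title']); ?></td>
    <td><?php echo $bug['reported_on']; ?></td>
    <td>
      <a data-toggle="modal" href="loadbug.php?id=<?php echo (int)$bug['id']; ?>"
         data-target="#modaltoopen" class="btn btn-xs btn-info">view info</a>
    </td>
  </tr>
<?php endforeach; ?>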

This works fine, except that identifying the modal pop-up in the data-target attribute does exactly what it should: it adds the content into the modal pop-up, but overwrites everything already in the modal.

What this effectively does is grab the content from the remote HTML (or in this case PHP) file and replace the whole modal content with it. What's wrong with that, you might ask? Well, the problem is that we really want to preserve the pretty modal title and close buttons, but there doesn't seem to be a clean way of dynamically writing these in, other than hacking the modal.js functionality.

OK, so how do we achieve this? You want simple? Think simple. If we need the modal title and close buttons, do they need to be dynamic? Probably not. In that case, why not add them into the remote page we are calling?

For example the remote HTML from that basic modal looks a little like this:

<p>Bug Reported on {date}</p>
<p>Platform: {app system}</p>
<p>Reported by: {user}</p>
<p>Description:</p>
<p>{details}</p>


but if we wrapped the modal attributes around that we would have:

<div class="modal-content">
      <div class="modal-header">
        <button type="button" class="close" data-dismiss="modal" aria-hidden="true">&times;</button>
        <h4 class="modal-title">Modal title</h4>
      </div>
      <div class="modal-body">
       <p>Bug Reported on {date}</p>
<p>Platform: {app system}</p>
<p>Reported by: {user}</p>
<p>Description:</p>
<p>{details}</p>

      </div>
      <div class="modal-footer">
        <button type="button" class="btn btn-primary" data-dismiss="modal">Close</button>
      </div>
    </div><!-- /.modal-content -->

So ultimately we see that the content we draw into the modal dialog can have the nice features without too much hard work!

For more examples of our web development work, drop us a note or get in touch through our website.

SMALL AMEND:

I noticed that, once called, the modal keeps the first call's data in the modal content div. To get the modal dialog to dump the body of the remote content each time it's closed, it's best to force a little data removal, thus:

<script>
$(function(){
  // #modaltoopen is the modal used in the examples above
  $('#modaltoopen').on('hidden.bs.modal', function() {
    $(this).removeData('bs.modal');
  });
});
</script>

All better now!

Sunday 11 May 2014

500 error with osTicket installation on Parallels Plesk panel

We use Parallels Plesk and occasionally have need of some of the (generally) useful bundled applications which come with the Linux flavour of Plesk.

One of the funkier application bundles we've come across is osTicket, a neat little ticketing system provided by the lovely people over at www.osticket.com. This open source system seemed to tick all the boxes as far as our needs for an independent project were concerned, and looked like a doddle when reading through the installation notes for the version we wanted (v1.8.1.2), so we took a look at the local catalogue version which came with Plesk and thought: that'll do us.

The thing about the bundled applications in Plesk is that they do somewhat "take care" of the installation, which bugged me a little as I wanted a bit more control over the process.

So off I headed to the local version of the catalogue (APS) application and started the install. (By the way, this assumes you've set up the subscription in Plesk and have a domain mapped, etc.)

 
Clicking on the Applications tab showed me (after a search for "ticket") that it was in fact available (Plesk 11.5 shown), so I adopted the "I want control" install option by clicking the small side arrow and choosing Install (Custom). I was then presented with a number of dialog screens, including one for additional options which allowed me to point it at a database I had already created (there's no problem with letting it add one for you, I just happened to have one myself).
 
The installation went fine after that, until... I tried to browse to the live installation and got the dreaded 500 Server Error. These errors come with the white page from hell, which tells you absolutely nothing about what caused the application to die, so you need to open the error log file and see what that tells you.
 
In Plesk the error logs are located in the domain's file system in a logs folder (typically /var/www/vhosts/{domain}/logs), which is accessible through FTP. Downloading the error log file and opening it in Notepad++, I saw a "premature end of script headers" error alongside a mod_fcgid warning.
 

I fixed the error about the premature end of script headers, which turned out to be some erroneous white space at the end of an include file (not of my doing), and then focussed on the mod_fcgid warning. This was quite an odd one, as it pointed to the FastCGI module not behaving itself. (Having seen these before, I'd seen the issue fixed with a hosting settings change, so I tried that out.)
 
In the Plesk Hosting Settings for the domain (Websites & Domains > {domain} > Hosting Settings), clicking on Hosting Settings lets you make minor configuration changes to the subscription.
 
The thing to try is changing the FastCGI option back to the plain CGI application type and saving the configuration back into the subscription. I've found that this normally doesn't need any server restarts or the like, and lo! the change worked - I refreshed the page which had given the 500 Server Error and the application started behaving as it should.
 
 
 
Hope this helps some other poor soul who has been pulling their hair out trying to configure osTicket - it looks like a top system, it's just that installing it can be a total pain!
 

Wednesday 9 April 2014

The Queens Hotel, a family hotel in Inverkeithing, Fife

Sometimes we all need a place where we can gather, be it for a joyous celebration of the birth of a new member of the family, or to congratulate a relative on a graduation, engagement or that "special" birthday. We've all looked at venues for such events and, frankly, baulked at the cost of staging a get-together when sometimes all we want is a big enough room to house everyone, a finger buffet and a bar, so the family or extended group of friends can mingle and reacquaint themselves - particularly if your own home doesn't have the time or space to host such events.

Enter the Queens Hotel in Inverkeithing, Fife. This hotel offers the perfect solution for anything from a small meeting through to an extended family get-together, with capacity for around 50 people in its largest function room.

The Queens Hotel prides itself on being a family hotel serving the local community's family needs - it isn't "grand" in the true sense, nor will they charge "grand" prices, but the family who run the hotel recognise the need for a family-friendly venue for getting together. They also cater for post-funeral teas and buffets, which allows you time to reflect without the worry of organising catering, venue and drinks - the team at the Queens Hotel are very professional when it comes to this.

So there you have it: a small bit of praise for a small family hotel which punches above its weight, run by a dedicated family team for local families in Fife - get in touch with them if you want a cost-effective family venue for your function, whatever it may be.

Monday 17 March 2014

Is it too late to ask?

OK, so this post is called "Is it too late to ask?" It's been spawned by one of my newest findings: how long can you leave it, when someone keeps referring to an acronym, before asking wtf it stands for?

I often deal with IT companies and Financial Directors (CFO or FD) who are particularly prone to using acronyms whenever they can - either it makes them look good or they genuinely are using acronyms to save time (LDAP, SBS, VCIO and, more latterly, DPA and PCI DSS). The thing about these is that, because they are quite short, they can be misinterpreted as the wrong full names. My example is DPA (Data Protection Act), which must be a royal pain for long-serving organisations such as Dudley Performing Arts or David Powell Associates, both of whom may well have been around longer than the Information Commissioner's Office.

My real concern is a very personal one, known to me as the Golden Minute - when you're in a meeting with new clients who want to impress you with their knowledge of the industry (normally web design and development, which has spawned its own set of acronyms - don't get me started on LAMP, MVC, or the exciting ones like CRUD, CRON, AJAX and CVS/SVN - more at http://1cm.me/EWBZz ) and they start mixing up their acronyms and creating their own versions which make little or no sense. What would you do? Normally I would interrupt and correct them gently, but if they're in full flow it can be a little uncomfortable to do this, so you adopt the professional approach of "storing" the acronym and then trying NEVER to use it when delivering your reply.

No, the bigger issue is how long you can leave it before you can't ask what the heck a WRPM is. I've determined that the lower on the keyboard the acronym's letters sit, the less chance you have of guessing the damn thing. It can be quite an interesting and occasionally humorous game trying to "guess" what an acronym stands for. Tread carefully, Reader, for you have entered the Golden Minute. If you don't raise your hand and make it clear you want to know what WRPM stands for (it's nothing to do with a Linux distribution either, nice guess!), you have about 2-3 minutes to sit it out and try to guess what it means before you reach the metaphorical Acronym Rubicon - the point of no return. If you don't 'fess up and ask what it stands for by then, you can't ask at all: all credibility will be lost and you will lose boardroom face, so you will need to struggle on manfully and proceed to dodge all WRPM-related questions. Heaven forbid someone from the other side of the table asks what you think about the issues "within the context of WRPM-linked progress". The worst part is frantically trying to guess what the full term might be, knowing that a wrong guess will make you look like a complete numpty.

My father was a soldier who trained cadets in the radio phonetic alphabet (Alpha, Bravo etc.) and asked one young guy to spell his surname phonetically (Lawson, in case you were wondering). The poor guy did exactly what you shouldn't do: he guessed. The correct phonetic rendering would be Lima-Alpha-Whisky-Sierra-Oscar-November; clearly the cadet had glanced over the chapter but taken very little in, and proceeded to announce: "Sure Sarge, it's... erm... Lima..? ...Alpha... Whisky... (long pause, click of the fingers, and boldly announces) ...September, October, November...!" A short spell washing some tanks gave him time to think it over...

So the conclusion of this tale: for heaven's sake, if you aren't sure what an acronym stands for, it's not a failing to ask - it'll save you time in the long run...

TTFN

Thursday 6 March 2014

Time savings through auto-posting to social media

If you have spent hours copying messages from one website into another and then having to push the information across to the social media networks, we may have the solution!

Twitter and Facebook are obviously keen to make sure people "stick" with them, and they provide APIs (application programming interfaces) in many scripting languages to help get your news into their frameworks. We have simply harnessed this in a meaningful way through our CMS (content management system), which makes the whole process straightforward.

For automatic news updates to social media networks, drop us a note and we'll share the love!