Solar System Orrery – HTML5 Canvas

Orrery Zoom

For quite a while I've been meaning to arrange some web hosting and put my solar system Orrery online for people to access. I'm pleased to say I've finally got around to doing it.

(Click here to go to the Interactive Orrery)

The project was part of the coursework for the 2D Graphics module of my Computer Science degree. It's written in JavaScript and uses the powerful HTML5 canvas for rendering.

It's not an accurate scientific representation; however, the planets' distances are to scale relative to each other (though not to the Sun), and the period in which each planet completes a full orbit (its year) is also true to life. There are two orbit modes, 'circular' and 'elliptical', plus two simulation modes where acceleration and velocity are calculated from the mass of each object and thus the force of gravity. One simulation mode keeps the Sun centred while the planets orbit around it; the second allows the Sun itself to be affected by its orbiting bodies.
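For anyone curious, the simulation modes boil down to a pairwise inverse-square gravity update each frame. The orrery itself is JavaScript, but here's a minimal sketch of the idea in C# (to match the code later in this post), with made-up types and a softening constant added to keep things stable:

```csharp
using System.Numerics;

// Hypothetical Body type for illustration; the real orrery is JavaScript.
class Body
{
    public Vector2 Position, Velocity;
    public float Mass;
}

static class Gravity
{
    // Gravitational constant; a simulation like this would use a scaled value.
    const float G = 6.674e-11f;

    // One Euler step: accumulate inverse-square acceleration from every
    // other body, then advance velocities and positions.
    public static void Step(Body[] bodies, float dt)
    {
        foreach (var body in bodies)
        {
            var accel = Vector2.Zero;
            foreach (var other in bodies)
            {
                if (ReferenceEquals(other, body)) continue;
                var delta = other.Position - body.Position;
                float distSq = delta.LengthSquared() + 1e-6f; // softening: avoid div-by-zero
                accel += Vector2.Normalize(delta) * (G * other.Mass / distSq);
            }
            body.Velocity += accel * dt;
        }
        foreach (var body in bodies)
            body.Position += body.Velocity * dt;
    }
}
```

The sun-centred mode simply skips the velocity update for the Sun, while the second mode applies it to every body.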

Elliptical orbit mode

It's really a bit of fun: you can create new planets of enormous size by simply holding the mouse down on the simulation until you're happy with the size, then letting go and watching how all the orbiting bodies are affected. You can also flick the planet as you release it to set its starting velocity (this seems to work much better in Chrome than in IE). I also highly recommend running it in full-screen mode by pressing 'W' if you have a reasonably specced system.

Another cool thing is the zoom feature: if you pause the program with 'P' you can scroll around with the cursor keys and take a look at some of the relatively hi-res images I used for each planet. The Earth and its orbiting Moon are pretty cool to zoom right into, as pictured above.

Detailed instructions are available on the page. Please check it out here and have a play around: www.alexrodgers.co.uk/orrery

Simulation mode


Exchange Reports Project Overview

During this summer, between semesters, I was fortunate enough to get a software development job with a local company just a ten-minute walk from my door. The project was to produce an 'Exchange Reports' system that would provide email messaging statistics exactly to the customer's specification. The system would be automated, so that once reports were designed they would be generated programmatically by a service and emailed to any recipients set up to receive each report. The solution was to consist of three distinct programs, developed along with configuration tools to set up the non-GUI processes in the solution (namely the services).

I have produced the following diagram to demonstrate the solution's processes (click to enlarge):

The design was in place when I started, and an existing code-base was also present, but the vast majority of the functionality still needed to be added. It was my first time working professionally as a software engineer, and therefore also my first time getting to grips with existing code written by developers no longer around, not to mention understanding the solution's technical proposal well enough to deliver exactly what the customer and my employer wanted. I think having worked in IT professionally for many years certainly helped me settle into a comfortable stride after the initial information overload of taking on, solo, what was a surprisingly large but beneficial technical project compared to what I had envisioned. Being thrown in at the deep end is probably the fastest way to improve, and I feel that, above all, I have taken a lot from this experience that will prove valuable in the future. I'm very pleased with the outcome and successfully got all the core functionality finished in the assigned time frame. I would wholeheartedly encourage students thinking of getting professional experience to go for it, ideally with an established company from which you can learn a great deal. Having experienced developers around to run things by is a great way to improve.

Now onto the technical details. The project was coded in C# and used WinForms, initially for testing processes and later for the configuration programs. I used a set of third-party .NET development tools from DevExpress that proved fantastic: a massive boon to anyone wanting to create quick, great-looking UIs with reporting functionality. SQL Server provided the relational database functionality, an experience I found very positive; I very much enjoyed the power of SQL when it came to manipulating data, whether via .NET data tables, data adapters, table joins or just simple direct commands.
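As a flavour of that data-table pattern, here's a minimal sketch; the query, connection string and table name are placeholders rather than the real schema:

```csharp
using System.Data;
using System.Data.SqlClient;

// Sketch of the DataTable/DataAdapter pattern mentioned above.
static DataTable LoadUsers(string connectionString)
{
    using (var connection = new SqlConnection(connectionString))
    using (var adapter = new SqlDataAdapter(
        "SELECT UserName, Department FROM Users", connection))
    {
        var table = new DataTable("Users");
        adapter.Fill(table); // opens the connection, runs the query, closes it again
        return table;
    }
}
```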

Using the diagram as a reference, I'll briefly go through each process in the solution, for a) those interested in such things and b) future reference for myself while it's still fresh in my mind, because I'll likely forget much of how the system works after a few months of 3D graphics programming and uni coursework :P.

Exchange Message Logs: 

In Exchange 2010, message tracking logs can be enabled quite simply and provide a wealth of information that can be used for analysis and reporting. They come in the form of comma-delimited log files that can be opened with a plain text editor. They have been around for many years, and in the past, during IT support work, I found myself looking at them from time to time to diagnose various issues. This time I'd be using them as the data source for a whole reporting system. The customer was a large international company, and to give an example, just one of their Exchange systems was producing 40 MB of these message logs each day. With these being effectively just text files, that's an awful lot of email data to deal with.
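To give an idea of the format, here's an illustrative sketch of reading one of these logs. Header lines start with '#', and the '#Fields:' line names the comma-separated columns; note a real parser also has to cope with quoted subjects containing commas:

```csharp
using System;
using System.IO;

// Illustrative reader for an Exchange message tracking log.
static void ParseTrackingLog(string path)
{
    string[] fields = null;
    foreach (var line in File.ReadLines(path))
    {
        if (line.StartsWith("#Fields: "))
        {
            fields = line.Substring("#Fields: ".Length).Split(',');
            continue;
        }
        if (line.StartsWith("#") || fields == null) continue;

        // Naive split; production code needs a proper CSV parser here.
        var values = line.Split(',');
        int sender = Array.IndexOf(fields, "sender-address");
        int eventId = Array.IndexOf(fields, "event-id");
        if (sender >= 0 && eventId >= 0 && values.Length > Math.Max(sender, eventId))
            Console.WriteLine("{0}: {1}", values[eventId], values[sender]);
    }
}
```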

Processing Service: 

The first of the three core components of the solution, the Processing Service is, as the name suggests, an installable Windows Service that resides on a server with access to the Exchange message log files. The service is coded to run daily at a specified time, and its job comprises five stages:

1. Connect to the Exchange server and retrieve a list of users from the Global Address List (GAL). This is done using a third-party Outlook library called 'Redemption' that enables this information to be extracted; the list is then checked for changes to existing users and for any new users. The users are placed in a table on the SQL database server and are used later to provide full name and department information for each email message we store.

2. Next, each Exchange message log is individually parsed (along the lines of the sketch in the previous section), and useful messaging information is extracted and stored in various tables on the database server. Parsed log file names are tracked in the database to prevent reading logs more than once.

3. Any message forwards or replies are identified and tallied up.

4. A separate Summary table on the database is populated with data processed from the message tables mentioned above. This table is what the reports look at to generate their data. Various calculations are made; one example is the time difference between an email being received and then forwarded or replied to, used to estimate response times (see the sketch after this list). A whole plethora of fields are populated in this table, far more than could comfortably fit on a single report. Because of this large amount of potentially desirable data, we later allow the user to select which fields they want from the Summary table in the 'Report Manager' if they wish to create a custom report; alternatively, and more typically, they use predefined database 'Views' created for them from the customer's specification, which give them access to only the data they need. Database Views are a really neat feature.

5. The database's messaging tables are scoured for records older than a threshold period, which are then deleted. This maintenance is essential to stop table sizes growing too large. The Summary data generated from those records is still kept, however, and I added functionality to archive it by serializing it out and deleting it from the database if required.
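To make stages 4 and 5 a little more concrete, here's a hedged sketch of the kind of SQL involved; every table and column name below is a stand-in for the production schema:

```csharp
using System.Data.SqlClient;

// Illustrative only: the schema shown is not the real one.
static void SummariseAndPrune(string connectionString, int retentionDays)
{
    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();

        // Stage 4 flavour: estimate response times as the gap between a
        // message arriving and its reply being sent.
        new SqlCommand(
            @"INSERT INTO Summary (UserId, AvgResponseMinutes)
              SELECT m.RecipientId, AVG(DATEDIFF(minute, m.Received, r.Sent))
              FROM Messages m
              JOIN Replies r ON r.OriginalMessageId = m.MessageId
              GROUP BY m.RecipientId", connection).ExecuteNonQuery();

        // Stage 5: delete raw message rows older than the threshold; the
        // Summary rows derived from them are deliberately kept.
        var prune = new SqlCommand(
            "DELETE FROM Messages WHERE Received < DATEADD(day, -@days, GETDATE())",
            connection);
        prune.Parameters.AddWithValue("@days", retentionDays);
        prune.ExecuteNonQuery();
    }
}
```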

Report Manager:

Initially we had thought to utilise DevExpress's 'Data Grid' controls in a custom Forms application, but we decided that the appearance of the reports generated this way was not satisfactory. This turned out to be a good design decision, since we later discovered DevExpress has remarkable reporting controls with very powerful design and presentation features that completely overshadow those of the Data Grids. After migrating some code from the old 'Report Manager' program and spending a day or two researching and familiarising myself with the DevExpress API, I had a great-looking new application that the customer will be using to design and manage the reports.

Report Manager program

The Report Manager allows you to design every aspect of a report through an intuitive drag-and-drop interface. Images and various graphics can also be added to beautify the design, though that wasn't something I did, nor had the time to attempt! The data objects can be arranged as desired, and the report's 'data source' information is saved along with its design layout via a neat serialization function inherent to the 'XtraReport' object in the DevExpress library; the result is stored in a reports table on the database server for later loading or building. You can also generate the report on the fly and export it to various formats, such as PDF, or simply print it. Another neat built-in feature is the ability to issue SQL query filters through a user-friendly interface for non-developers in the report designer; the filter is stored along with the layout, so the user designing the report has absolute control over the data. For example, a quick filter on Department being 'Customer Services' returns only that department's message data, without me needing to code a method to do this manually, as was the case with the Data Grids.
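The serialization round-trip is pleasantly small. A sketch of the approach, using the SaveLayoutToXml/LoadLayoutFromXml calls on XtraReport (the database read/write is omitted):

```csharp
using System.IO;
using DevExpress.XtraReports.UI;

// The layout (including data source info) round-trips through a stream;
// the resulting bytes are what end up in the reports table.
static byte[] SerializeLayout(XtraReport report)
{
    using (var stream = new MemoryStream())
    {
        report.SaveLayoutToXml(stream);
        return stream.ToArray();
    }
}

static XtraReport DeserializeLayout(byte[] layoutBytes)
{
    var report = new XtraReport();
    using (var stream = new MemoryStream(layoutBytes))
        report.LoadLayoutFromXml(stream);
    return report;
}
```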

In the top left you'll see specific icons that provide the necessary plumbing for the database server. 'Save', 'Save As' and 'Load' respectively write the serialized report layout to the database, create a new record with said layout, or load an existing saved report from the database into the designer. Loading is achieved by retrieving the list of report records stored in the reports table and placing it into a Data Grid control on a form, where you can select a report to load or delete. The 'Recipients' button brings up the interface for adding users who want to receive the report by email; this retrieves the user data imported by the Processing Service and populates a control that lets you search through and select a user, or manually type a name and email address to add a custom recipient. Additionally, upon adding a recipient to a report you must select whether they wish to receive it on a daily, weekly or monthly basis. This information is stored in the aptly named recipients table and relates to the reports via a reportID field.
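A sketch of what adding a recipient boils down to, with a placeholder schema standing in for the real recipients table; the shape of the relationship (recipient rows pointing at a reportID) is the important part:

```csharp
using System.Data.SqlClient;

// Placeholder schema; the real table and columns differ.
static void AddRecipient(string connectionString, int reportId,
                         string name, string email, string frequency)
{
    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        var command = new SqlCommand(
            @"INSERT INTO Recipients (ReportID, Name, Email, Frequency)
              VALUES (@reportId, @name, @email, @frequency)", connection);
        command.Parameters.AddWithValue("@reportId", reportId);
        command.Parameters.AddWithValue("@name", name);
        command.Parameters.AddWithValue("@email", email);
        command.Parameters.AddWithValue("@frequency", frequency); // daily/weekly/monthly
        command.ExecuteNonQuery();
    }
}
```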

Report Service:

Nearly there (if you've made it this far, well done): the last piece in the solution is another Windows Service, called the 'Report Service'. This program sits and waits to run as per a schedule that can be set by a configuration app I'll mention shortly. Like the Processing Service, part of its logic is to check whether it's the right time of day to execute, and the service polls itself every few minutes to see if this is the case. Upon running, it looks to see whether it's the right day for daily reports, the right day of the week for weekly reports, or the right day of the month for (you guessed it) the monthly reports. If it is, it grabs the joined data from the reports and recipients tables, builds each report, and fires them out as PDF email attachments to the associated recipients. It makes a note of the last time it ran to prevent it running repeatedly on the same valid day.
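The schedule test itself is simple. Here's an illustrative sketch; the weekly and monthly trigger days shown are assumptions, not necessarily what the real service used:

```csharp
using System;

// Assumed trigger days: weekly reports on Mondays, monthly on the 1st.
enum ReportFrequency { Daily, Weekly, Monthly }

static bool IsDue(ReportFrequency frequency, DateTime now, DateTime lastRun)
{
    if (lastRun.Date == now.Date) return false; // already ran today

    switch (frequency)
    {
        case ReportFrequency.Daily:   return true;
        case ReportFrequency.Weekly:  return now.DayOfWeek == DayOfWeek.Monday;
        case ReportFrequency.Monthly: return now.Day == 1;
        default:                      return false;
    }
}
```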

Configuration Tools:

Two configuration apps were made, one for the Processing Service and one for the Report Service. The two services have no interfaces, since they run silently in the background, so the apps, together with an XML settings file, provide a way to store a variety of important data such as SQL connection strings and server authentication details (encrypted), expose certain manual debugging options that may need to be executed, and give an interface for setting both services' run times and the report delivery schedule.
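As a sketch of that settings plumbing: the element names below are made up, and DPAPI (ProtectedData) stands in for whatever encryption scheme the real tools used:

```csharp
using System;
using System.Security.Cryptography;
using System.Text;
using System.Xml.Linq;

// Plain values live in the XML; credentials are encrypted at rest.
static void SaveSettings(string path, string connectionString, string password)
{
    byte[] cipher = ProtectedData.Protect(
        Encoding.UTF8.GetBytes(password), null, DataProtectionScope.LocalMachine);

    new XElement("Settings",
        new XElement("ConnectionString", connectionString),
        new XElement("Password", Convert.ToBase64String(cipher)),
        new XElement("RunTime", "02:00")).Save(path);
}

static string LoadPassword(string path)
{
    byte[] cipher = Convert.FromBase64String(
        XDocument.Load(path).Root.Element("Password").Value);
    return Encoding.UTF8.GetString(
        ProtectedData.Unprotect(cipher, null, DataProtectionScope.LocalMachine));
}
```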

Screens below (click to enlarge):

So that's the solution, start to finish. Depending on time, I'm told it could potentially be turned into a product at some point, which would be great, since other customers could benefit from it too.

The great thing about a creative industry like programming, whether business or games, is that you're ultimately creating a product for someone to use. It's nice to know people somewhere will be getting use and function out of something you have made, and that's just one reason why I've thoroughly enjoyed working on the project. I've learned a lot from my colleagues while working on it and hope to work with them again.

You also get a taste of real-life professional development and how it differs in various ways from academic teaching, which, although very logical and sensible, is also idealistic (and rightly so). In the real world, when time is money and you need to turn around projects to sustain the ebb and flow of business, you have to do things in a realistic fashion, which might mean cutting some corners when it comes to programming or software design disciplines. I always try my best to write code that's as clean as possible, and this was no exception, but ultimately you need to get the project done first and foremost. It's interesting how that can alter the way software development pans out, with niceties like extensive documentation, 'Use Case' diagrams and robust unit testing potentially falling by the wayside in favour of a speedier short-term turnaround. I imagine larger businesses can afford to manage these extra processes to great effect, but for small teams of developers it's not always realistic, which I can now understand.

Hypermorph Wins Three Thing Game Competition

So it's been a frantic couple of weeks: plenty of coursework to do, and last weekend was the much anticipated Three Thing Game competition. For anyone not in the know, this is held each semester at Hull University and challenges teams to come up with a game based around three auctioned words per team. Judges then score based on the game's relevance to the words and the quality/fun of the game. The competition involves a marathon 24-hour programming session to get your game finished on the day. This one was the biggest yet, with 39 teams competing. We really couldn't have asked for better "Things", because a combination of good bidding and luck meant we came out with "Flying", "Tank" and "Bombs". Considering another team got "Teddy Bear", "Deodorant" and "Pop Tart", I think we did OK!

Last year we came second with Shear Carnage, and I can honestly say that this year we really, really wanted to win it. This was evident just from the focus we had, and when the day of the competition came I think I left my seat half a dozen times in the whole 24 hours! In hindsight we probably took it far too seriously; as a result I think we sacrificed a lot of the enjoyment of the competition, and there was some contention over ideas that seemed inevitable given vested interests and no single leader within the team. On a personal note, much was learnt about teamwork, and there are aspects of the planning and design process I would do differently next time. Luckily it all turned out worth it in the end, so it's very hard to regret any decisions, but this was by no means a painless endeavour!

Me on the right, Russ in the middle, John on the left. Lee Stott at the back.

So, to the game. Hypermorph is a retro-style side-scrolling shooter that takes me back to my childhood days playing classics such as Xenon 2, R-Type and Menace on the Amiga. Back then the shoot-'em-up was a staple video game genre and hugely popular; only now that mobile platforms have taken off is the genre feasible again, because it's the perfect style of game to have a quick blast on when you want to pass a little bit of time. The thing that's pretty novel in Hypermorph is the player's ability to switch between two different forms, a spaceship and a hover tank, by simply tapping the screen. We made the game using XNA (C#) for Windows Phone 7 and coded everything ourselves (no third-party libraries).

I produced the art for the game, and managing both the art and a lot of the programming was a challenge in itself on the day, resulting in most of the art being done in the last few hours. I had a good idea in my head of what the game would look like when we were bouncing the initial idea around; my regret is that I didn't produce any concept art sooner to put the rest of the team at ease. For a long time I think we were each left with our own ideas of how the game would look, but once I came up with the first concept drawing for the ship, the team were all in favour, to my relief!

We had decided to make the game quite dark and moody, but with bright weapon and explosion effects to make them really stand out. Additionally, we wanted to make the controls as hands-off as possible. We learned from Shear Carnage that using touch too frequently can obscure a lot of the screen, so we instead went for tilt-based movement for the player and a single touch to morph between tank and spaceship. Importantly, we set it to auto-fire constantly, since you soon realise that in this genre there's never a time you don't want to be firing.
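For anyone curious how that looks in XNA on WP7, here's a minimal sketch of the control scheme; the field names, tilt scale and structure are illustrative rather than our actual code:

```csharp
using Microsoft.Devices.Sensors;           // WP7 accelerometer
using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Input.Touch;

// Illustrative only: tilt steers, a tap morphs, firing is automatic.
public class PlayerController
{
    Vector3 tilt;
    public bool IsTank;
    public Vector2 Position;

    public PlayerController()
    {
        var accelerometer = new Accelerometer();
        accelerometer.ReadingChanged += (sender, e) =>
            tilt = new Vector3((float)e.X, (float)e.Y, (float)e.Z);
        accelerometer.Start();
    }

    public void Update(float dt)
    {
        // Tilt drives movement, so no on-screen controls obscure the view.
        Position += new Vector2(tilt.X, -tilt.Y) * 400f * dt;

        // Any fresh touch toggles between tank and spaceship.
        foreach (TouchLocation touch in TouchPanel.GetState())
            if (touch.State == TouchLocationState.Pressed)
                IsTank = !IsTank;

        // Auto-fire runs every frame elsewhere, so nothing to do here.
    }
}
```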

One feature I'm really pleased we put in is the voice effects for power-ups and various other things. It adds a lot to the immersion and, again, really goes back to the genre's roots.

Of course we have plans to get Hypermorph out on both the WP7 and Windows 8 marketplaces ASAP, but uni coursework is currently being prioritised. At the competition were Lee Stott from Microsoft and guys from the MonoGame team. Lee's encouragement was inspiring, and I'd like to thank him and Microsoft for providing the cool prizes. The MonoGame guys were brilliant, and we spent a fair amount of time chatting with them about getting our games ported to the various platforms; they even ported Shear Carnage and my Robocleaner game for us to show us how easy it is (albeit with some coding required to get them ready for the marketplace).

Ultimately we want to put in a few more levels, enemy types, weapons and power-ups before getting it on the marketplace, but the good news is that it will most certainly be free!

All in all it was overwhelming, and the encouragement we received from Lee Stott, Rob Miles and the MonoGame guys was great. Ultimately this is why I gave up a career in IT to get into the games industry: there's so much satisfaction in putting your heart and soul into producing a game and then seeing others get a lot of enjoyment from it. Winning the People's Choice award as well as the judges' award was the icing on the cake, and I'd like to thank everyone who voted for us and gave us great feedback.

Stay tuned for more Hypermorph news soon…