Adding Content to the Skeleton, an Adapted Scrum Methodology

This week marks the beginning of adding real content to the skeleton site that I have been developing for Massachusetts HOSA. The content was provided to me as a Word document, with titles to differentiate between pages and sections. The first thing I did upon receiving the document was break it down into individual user stories. I then created a Trello card for each story and placed the cards on the product backlog list. During this process, I was reading and building an understanding of the product owner’s vision for the website, as well as attempting to come up with potential acceptance criteria. I also informally estimated the amount of time and work that each story would take, but decided not to include these estimates as part of the stories, since doing so would offer minimal return on investment with me as the only developer. I looked at this process as a sort of modified Scrum Story Time meeting.

The next step was Sprint Planning, in which I plucked some of the story items that I saw as most important and placed them on the Sprint Backlog list. During my modified Story Time, I had already, for the most part, broken these stories down into pieces small enough to be worked on as tasks. This was the simplest way to handle breaking down the requirements, and it is allowable because I am the only one working on these tasks.

Once I felt that I had taken enough stories from the backlog to keep me busy for a week, I began work. This involved adding content in the form of text and images to existing blank pages on the skeleton website, as well as creating new pages as needed. Some of the story items involved formatting things like the footer and sidebar to achieve the desired effect. Throughout development, various questions arose.

Each time that I encountered something that I needed clarification on, I added a comment to the Trello card. For this reason (and many others), I really like using Trello cards to represent user stories and tasks. Trello also allowed me to attach images, checklists, tags, and mentions directed at specific users. I like the idea of keeping all of the relevant information about a specific user story in a single place rather than scattering it across email, in-person meetings, and other channels. I’ve been attempting to document in Trello any information passed through those other channels so as to have a complete log of interactions.

I’m looking forward to getting into some more specific development on the site. Next week’s post will include a review and retrospective for this sprint.


Software Development as an Art

Hoover and Oshineye make some excellent points that I strongly agree with in the Craft Over Art pattern from Apprenticeship Patterns. Although I had never thought about software development in terms of art, I found their discussion of craft versus fine art to be interesting and well supported. While it is relatively simple to argue that software development is a craft, I would imagine that claiming it is an art would generate a bit more controversy. In my opinion, Hoover and Oshineye do an excellent job of supporting their claim that software development, by nature of being a craft, is also an art, though not a fine art. The distinction that they make between craft and fine art has to do with purpose. Because software development’s purpose is to make a useful product for customers, it can be seen as utilitarian in a sense. Fine art, on the other hand, is produced purely for beauty.

The important caveat that Hoover and Oshineye introduce in the Craft Over Art pattern is that the craft of software development may produce something beautiful, but it must produce something useful. This may mean that craftsmen must sometimes choose utility over beauty. Considering the interests of the customer over personal interests and balancing conflicting demands are important parts of building and maintaining strong relationships with the customer.

This pattern applies in a couple of ways to my current undertakings. Creating software that addresses real problems for real people is somewhat new to me. Writing code for AMPATH Informatics is exciting because of the real-world significance that my contributions have. The other place where this pattern applies is in my independent study web development project for Massachusetts HOSA. While the customer-developer relationship is a bit different in each of these cases, many of the same tips from the Craft Over Art pattern apply. While I may not be paid for my services, I still have expectations placed on me by the instructors and organizations involved in these projects. Understanding and carefully considering my responsibility to deliver a product that is first and foremost useful will help me foster strong relationships with my collaborators on these projects.

Effective Data Migration Testing

When you consider the term “software quality assurance and testing,” what comes to mind? For me, I think of developing test cases that exercise program functionality and aim to expose flaws in the implementation. In my mind, this type of testing comes mainly before a piece of software is released, and often occurs alongside development. After the product is released, the goals and focus of software quality assurance and testing change.

My views were challenged, however, when I recently came across an interesting new take on software testing. The post by Nandini and Gayathri titled “Data Migration Testing Tutorial: A Complete Guide” provides helpful advice and a process to follow when testing the migration of data. These experienced testers draw on their experience to point out specific places in a data migration where errors are likely to occur, as well as effective methods of exposing these flaws before they impact end users and the reputation of the company.

The main point that Nandini and Gayathri stress is that there are three phases of testing in data migration. The first phase is pre-migration testing, which takes place, as the name would suggest, before the migration. In this phase, the legacy state of the data is observed, providing a baseline to which the new system can later be compared. During this phase, differences between the legacy application and the new application are also noted, and methods of dealing with these differences are developed and implemented to ensure a smooth transfer of data.
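The tutorial doesn’t prescribe any particular tooling for capturing that baseline, but as a minimal sketch, assuming a hypothetical SQLite legacy database and made-up table names, recording row counts and checksums before migration might look something like this:

import hashlib
import json
import sqlite3

def capture_baseline(db_path, tables):
    # Record a row count and a simple content checksum for each table.
    baseline = {}
    conn = sqlite3.connect(db_path)
    try:
        for table in tables:
            rows = conn.execute(f"SELECT * FROM {table} ORDER BY 1").fetchall()
            digest = hashlib.sha256(repr(rows).encode("utf-8")).hexdigest()
            baseline[table] = {"row_count": len(rows), "checksum": digest}
    finally:
        conn.close()
    return baseline

# "legacy.db" and the table names below are placeholders for this example.
snapshot = capture_baseline("legacy.db", ["members", "events", "chapters"])
with open("premigration_baseline.json", "w") as f:
    json.dump(snapshot, f, indent=2)

The saved snapshot gives the post-migration tests something concrete to compare against.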

The second phase is the migration testing phase, in which a migration guide is followed to ensure that all of the necessary tasks are performed to accurately migrate the data from the legacy application to the new application. The first step of this phase is to create a backup of the data, which can serve as a rollback point in case of disaster. Also during this phase, metrics including downtime, migration time, time to complete n transfers, and other relevant information are recorded so that the success of the migration can be evaluated later.
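As a rough illustration of recording those metrics, and assuming the migration itself is wrapped up in a hypothetical migrate_data.py script, timing the run might look like the following sketch:

import json
import subprocess
import time

def run_migration_with_metrics(command):
    # Time the migration command and capture its exit status.
    start = time.time()
    result = subprocess.run(command, capture_output=True, text=True)
    return {
        "command": " ".join(command),
        "elapsed_seconds": round(time.time() - start, 2),
        "return_code": result.returncode,
    }

# migrate_data.py is a placeholder; any migration script or tool could be timed this way.
metrics = run_migration_with_metrics(["python", "migrate_data.py"])
with open("migration_metrics.json", "w") as f:
    json.dump(metrics, f, indent=2)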

The final phase of data migration testing occurs post-migration. During this phase, many of the tests that are used can be automated. These tests compare the data from the legacy application to the data in the new application and alert testers to any abnormalities or inconsistencies. The tutorial lists 24 categories of post-migration tests that should be completed satisfactorily in order to say that the migration was successful.
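A very small example of what one of these automated comparisons could look like, again assuming hypothetical SQLite databases and placeholder table and column names, is sketched below:

import sqlite3

def compare_table(legacy_db, new_db, table, key_column):
    # Flag row-count differences and mismatched rows between the two copies of a table.
    legacy = sqlite3.connect(legacy_db)
    new = sqlite3.connect(new_db)
    query = f"SELECT * FROM {table} ORDER BY {key_column}"
    legacy_rows = legacy.execute(query).fetchall()
    new_rows = new.execute(query).fetchall()
    legacy.close()
    new.close()

    problems = []
    if len(legacy_rows) != len(new_rows):
        problems.append(f"{table}: row count changed from {len(legacy_rows)} to {len(new_rows)}")
    for old_row, new_row in zip(legacy_rows, new_rows):
        if old_row != new_row:
            problems.append(f"{table}: mismatch at key {old_row[0]}")
    return problems

# "legacy.db", "new.db", and the table/column names are placeholders.
for issue in compare_table("legacy.db", "new.db", "members", "member_id"):
    print(issue)

A real post-migration suite would cover many more of the tutorial’s categories, but even a simple row-by-row comparison like this can surface inconsistencies quickly.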

Reading this tutorial on data migration testing has certainly changed my views on what testing means. The actual definition seems much broader than what I had thought in the past. Seeing testing from the perspective of migrating applications gave me insight into the capabilities of software testers and the responsibilities placed on them. If something in the migration does not go according to plan, it may be easy to place blame on the testers for not considering that case. I enjoyed reading about software testing from this new perspective and learning some of the most important things to consider when performing data migration testing.