

Showing posts with label External vendor. Show all posts

Monday, July 29, 2013

Working with vendors: Asking for a weekly status report

When a team works with vendors, there is always an element of doubt about the capabilities of the vendor team. In many cases this is not because the vendor team is any less capable, but because every product or project is different from the others, and it takes time for the vendor team to reach the same level as the core team. This may not always happen; there may not be enough time during the course of the project for the vendor team to come anywhere close to that level. That is one case, but there are others: the coordinator on the vendor side may not be very competent, or there may be other reasons that leave the client team feeling there is some problem with the way the vendor team is executing the project.
A number of these problems arise from coordination and communication issues, and it is important that such matters be resolved; there should be a communication protocol set up to ensure that they do not cause conflict between the teams. There are several methods for establishing a regular communication process between these teams:
- Senior leads from both teams should set up a regular meeting for discussing issues (in my experience, this was a once-a-week meeting that could be cancelled if there were no issues; it was a big help in quickly reaching conclusions on open items)
- A regular status meeting between the managers of both teams (such a meeting ensured that issues being escalated were discussed and action items decided on how to resolve them; in my experience, this was also a weekly meeting that could be cancelled if not needed)
- The simplest way we devised to highlight current status, ongoing items and open issues was a weekly status report. We discussed this with the managers and leads of the vendor teams, and then worked out a format that covered all these status items. For example, if there was an issue that needed to be highlighted from the vendor team, they would put it in the report along with the other items, and the report was circulated to the entire team. This ensured that everyone knew what the vendor team was doing, what the next items on their schedule were, and what major issues they were facing. It also prompted team members to flag items where their understanding differed from what the vendor had communicated, which quickly led to a resolution of issues.
We asked the vendor team to ensure that this report was available every Monday afternoon and covered all the items from the previous week; on the odd occasion when team members were working over the weekend, those items were also incorporated in the report. A side benefit was that these reports conveyed an impression of the amount of work being done by the vendor teams, which served as a subjective cross-check during the billing process.
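To make the idea concrete, here is a rough sketch of how such a Monday report skeleton could be generated. The section names are my own invention for illustration, not the actual format we agreed with the vendor:

```python
from datetime import date, timedelta

# Illustrative section headings for a weekly vendor status report
# (assumed names, not a fixed standard).
SECTIONS = [
    "Work completed last week",
    "Work planned for this week",
    "Open issues needing attention",
    "Items blocked / waiting on the client team",
]

def report_skeleton(today: date) -> str:
    # The Monday report covers the week ending the previous Sunday.
    week_end = today - timedelta(days=today.weekday() + 1)
    week_start = week_end - timedelta(days=6)
    lines = [f"Vendor status report: {week_start} to {week_end}", ""]
    for section in SECTIONS:
        lines.extend([f"## {section}", "- (fill in)", ""])
    return "\n".join(lines)

print(report_skeleton(date(2013, 7, 29)))
```

The vendor lead fills in each section before circulating the report to both teams, so weekend work done up to Sunday is always in scope.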


Monday, June 24, 2013

Sharing of test data with vendors along with test cases ..

In several previous posts, I have outlined some details of interaction with vendors, for example - (Outlining focus areas of testing, Process for answering queries from vendors, Extracting relevant test cases); all of these cover some area of interaction with vendors. Working with vendors is not just one single area; there is a lot related to information about processes, tools to be used, focus areas to be decided, and so on. Yet another area of interaction between the vendors and the core testing team is the data to be used for the various areas of testing.
Having test data is very important. You may have all the test cases required for complete feature testing of the application, but without the test data, none of it means anything. For example, if you are testing a video application, you need to be able to test the different video formats that the application supports, and there are a large number of such formats to test (unless you are in the video domain, you would not believe the number of formats that exist, given the large number of companies doing something with video). For each such video format, you need the corresponding data - multiple video files in each format. As another example, you could have a video file that is just a collection of images, another without audio, and yet another that combines video and audio; there could be a defect that only shows up when the video file has no audio. We once found a defect like this, and it was a big pain to figure out what the problem was; after this, we had to add files with and without audio to our testing matrix.
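A simple way to make sure variants like "no audio track" are never skipped is to enumerate the full format-by-variant matrix rather than listing files ad hoc. The format and variant names below are illustrative, not our actual matrix:

```python
from itertools import product

# Hypothetical coverage matrix: every container format crossed with
# every content variant, so the "no audio" case cannot be forgotten.
formats = ["mp4", "mov", "avi", "mxf"]
variants = ["video+audio", "video only (no audio)", "image sequence"]

matrix = list(product(formats, variants))

for fmt, variant in matrix:
    print(f"test file needed: .{fmt} / {variant}")
print(f"total test files: {len(matrix)}")
```

Gaps in the real file collection then show up as matrix cells with no corresponding file, instead of being silently missing.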
And video files are large. The total size of the video test data we used for complete testing was more than 50 GB, and it was guarded with great care. Because of the sheer size of the test data, we had decided not to put it in the source safe we used (the data backup of the source safe meant that the people running it were not happy about backing up so much binary data rather than text-based code files), and hence had made multiple disk copies of this test data. Some effort was required to ensure that there was a master copy of the data and that all copies were synchronized to it.
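One way to reduce that synchronization effort is to compare per-file checksums between the master and each disk copy, rather than eyeballing directory listings. A minimal sketch (paths and structure are assumptions, not what we actually ran):

```python
import hashlib
from pathlib import Path

def digest_tree(root: Path) -> dict[str, str]:
    """Map each file's path (relative to root) to its SHA-256 digest."""
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*"))
        if p.is_file()
    }

def diff_digests(master: dict[str, str], copy: dict[str, str]) -> list[str]:
    # Files missing from either side, plus files whose content differs.
    mismatched = {f for f in set(master) & set(copy) if master[f] != copy[f]}
    return sorted((set(master) ^ set(copy)) | mismatched)

def out_of_sync(master_root: Path, copy_root: Path) -> list[str]:
    return diff_digests(digest_tree(master_root), digest_tree(copy_root))
```

Running `out_of_sync` against each disk copy lists exactly which files need to be re-copied from the master, which is much cheaper than re-copying 50 GB wholesale.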
Are you getting an idea of the problem we faced when we added a vendor testing team to this matrix? We had to tell them the focus areas of testing, we had to extract the required test cases and make sure these made sense to somebody who did not have the same experience as the core team, and then we also had to pass on the large test data to the vendor while linking it to the test cases. Even though we had a fast connection to the vendor, there were permission requests to be processed, since some of our test data had come from other vendors with no permission to pass it on; every time we needed to hand such data to a vendor, we needed permission (no monetary problem, just the paperwork and time required). And since the test data kept changing as new video cameras entered the market, there were synchronization problems. In one unusual case, we found that a particular set of high-end video files required a very high-end machine, and this information was not captured properly; when the vendor tried to use those files, things did not work.
The challenges differ depending on the type of test data, but there needs to be a strategy for passing the required test data on to the vendor.
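One lesson from the high-end-machine incident is that a manifest should travel with the test data, linking each file to its test cases and recording hardware requirements and redistribution permissions. A sketch of what such a manifest could look like (all field names and values invented for illustration):

```python
import json

# Hypothetical manifest accompanying the test data set.
manifest = [
    {
        "file": "4k_highbitrate_sample.mxf",
        "test_cases": ["TC-1042", "TC-1043"],
        "requirements": {"min_ram_gb": 16, "notes": "high-end machine required"},
        "redistribution_permission": "pending",
    },
    {
        "file": "no_audio_sample.mp4",
        "test_cases": ["TC-0880"],
        "requirements": {},
        "redistribution_permission": "granted",
    },
]

# Files that cannot yet be shipped to the vendor because the
# permission paperwork is still open.
blocked = [e["file"] for e in manifest if e["redistribution_permission"] != "granted"]
print(json.dumps(blocked))
```

With this in place, the machine requirement and the permission status are visible to the vendor before they try to use a file, instead of being discovered the hard way.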


Saturday, June 22, 2013

Working with vendors: Telling them the points of focus for the particular version ..

In some of the previous posts, I have been talking about the issues and processes involved in ensuring that interaction with vendors (especially when they are involved in testing) goes smoothly. In the last post (Working with vendors to ensure that tools are synchronized), I talked about how the team should spend time defining a process for transferring knowledge about the tools used during testing. This is easy for tools in generic usage, such as tools for defect management, source control, or for capturing test plans and test cases. However, it is more complicated for specialized tools such as Fiddler, which take vendors more time to learn.
In this line of thought, there are several other processes that the team needs to transfer to the vendor team, and to do so in a systematic way to increase the level of efficiency. Some of the processes the team might need to convey to the vendor team are:
- Certain specific processes, such as dealing with defects that have existed for some time. Consider a defect that has been present in the product for more than one version, where the team has decided not to make the fix. Similarly, there may be cases where there is a difference of opinion about the functionality, and anybody doing this testing would be tempted to log a defect. In such cases, the existing team members know about these defects and do not try to file them, but somebody new to the area, such as a vendor team member, will not know and will file defects there. It is up to the product team to ensure they have passed instructions on how to handle these defects to the vendor; if this does not happen, time will be wasted when the vendors actually do this testing. I saw such a case where there were new members on the vendor team, there was a delay in passing on this information, and suddenly there was a buzz in the team about the vendor finding a number of defects that the team members had not found. There was some pressure on the testing team, but it was then found that most of these defects were of the types the team already knew about, and there was a sigh of relief (they would have come under a lot of pressure if the vendors had found many defects that the core testing team had missed).
- Similarly, there are certain points of focus for every release. It could be a patch or dot release focusing on a certain area, and it is incumbent on the product team to pass this focus on to the vendor team. This is needed to ensure that the testing done by the vendor team concentrates on the relevant area instead of spending time on areas that are not relevant. The team needs to decide on the best way to do this; one way is to take the application where test plans and test cases are captured and create a plan that shows the vendor the focus areas where testing needs to be done. However, this presupposes that the test cases are written in such a way that a subset can be extracted for the vendors and still make sense.
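If each test case carries tags for the areas it covers, extracting the vendor's subset for a focused release becomes a simple filter. A sketch, with invented case IDs and area tags:

```python
# Hypothetical test-case records, each tagged with the areas it exercises.
test_cases = [
    {"id": "TC-101", "areas": {"import", "ui"}},
    {"id": "TC-102", "areas": {"export"}},
    {"id": "TC-103", "areas": {"export", "performance"}},
]

# e.g. a dot release that only touches the export feature.
focus_areas = {"export"}

# Keep only the cases that overlap the release's focus areas.
vendor_plan = [tc["id"] for tc in test_cases if tc["areas"] & focus_areas]
print(vendor_plan)
```

The quality of the result depends entirely on the tagging discipline, which is exactly the "written in such a way that a subset makes sense" caveat above.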


Working with vendors: Ensuring that tools are synchronized ..

This is part of a series of discussions on how to work with people outside the core software development team, who in most cases are vendors. You would be working with vendors for a variety of reasons:
- Part of the testing needed is done by the vendors
- There is sudden work to be done beyond the core work, and the additional testing needs to be done by vendors
- The product is being localized into many languages and the testing in various languages is being done by vendors
- Multiple other reasons such as these, all of which create the need for vendors to take part in testing.

Now, when you have vendors working with the core team for testing purposes, you will be lucky if there is continuity in the personnel on the vendor side over the years. Our experience has been that unless we are able to provide business to the vendor year-round, people on the vendor side move on to different projects and are not so easily available the next time there is a need. In addition, there are times when there is a sudden need for testing, and at that time, testers on the vendor side who have experience with the product may not be available.
In previous posts (Process setting, Kickoff and knowledge transfer), I have talked about some of the processes involved in working with vendors who assist in testing, and how to get them trained as easily as possible. However, there is another area where knowledge transition needs to happen efficiently, and that is the tools used for preparing the data as well as doing the testing.
Applications such as Defect Management systems and source repository systems are easy for vendors to understand, given that even if the vendors have been using a different tool for these functions, the basic concept of using these tools does not change too much, and hence it should be fairly easy for them to pick up a new tool.
However, it gets problematic when the team is using specialized tools, since there is a high chance that the personnel on the vendor side will not have experience with them. We had this in a previous project, where we were using specific tools such as 'Fiddler' for testing the flow between the desktop application and an online application. When we spoke to the vendor team around the start of the project, they had no experience with a similar tool, and yet it was necessary for them to do the testing (we could not ourselves cover all the areas that were the vendor's responsibility, so it was important for the vendor to learn this tool). Finally, we had to run a few rounds of web conferencing with the vendor to explain how to use this and other such tools. It would have been more efficient to have already prepared a document for the various tools the team uses, along the lines of a self-learning guide.

Read more about working with vendors in the next post (Processes when dealing with vendors)


Thursday, June 20, 2013

Working with vendors: Setting up a process for queries and the back and forth for these queries

I had been working with a development team for a long time, and one of the most frustrating items for the team was working with people external to the team. The team had been working on multiple versions of the product for a long time and had a lot of experience with it (many members had been working on different parts of the product for more than 5 years), and hence had a level of knowledge in some areas superior even to the product manager (who also had a lot of experience with the product). However, the team needs to work with a large number of people who are not so experienced. This happens with a number of teams: typically, teams work with people who do localization of the product, with teams who do documentation, and in many cases, some amount of testing is also done by external teams.
In such cases, the biggest problem is that the external teams will NOT have the same amount of experience with the product as the product team, and in many cases, the external team member working on the product has very little or no experience with it. If there is some kind of fast learning program that enables the external team member to learn about the product, that is beneficial, but in some cases such help is not possible (primarily because it takes effort to produce, and the product may be changing fast between versions, so the material would need to be updated fairly quickly).
But whatever the level of help available, the external team member will take time to come up to an acceptable level, and they will have a lot of queries along the way. Some queries could be answered if there were an FAQ for the product, but FAQs typically cover the product as a whole and may not help once an external tester starts getting into a feature in detail. In such a case, you will have people with queries, and things will move faster if these queries are settled quickly.
And this is where the frustration comes in. Unless there is some kind of process for such queries, team members can feel that the same queries come up on a regular basis or get repeated (in fact, when talking to team members, this feeling was expressed again and again, with repeated requests to ensure that it got resolved).
We tried multiple measures to set up some kind of process for capturing all these queries, especially to ensure that they were categorized by feature (which meant a database was built up of the queries that these external vendors had). First, we tried using an MS-Excel document for capturing them, but we started running into concurrency problems with exchanging the document back and forth and deciding who maintained the master copy. Given that the team was very comfortable with the defect management software, and the vendors would also have to start using it, we laid down a process whereby every query was logged as a defect against the specific feature it concerned (with a keyword identifying it as a vendor query). These defects were assigned to the respective feature owner, who would respond to the query, and the vendors were also told to go through these defects before raising a new query.
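Once queries live in the defect tracker with a marker keyword, triaging them for the weekly review is just a filter and group-by. A sketch of the idea (the record shape and the "vendor-query" keyword are assumptions for illustration, not our tracker's actual schema):

```python
# Hypothetical defect records exported from the defect management tool.
defects = [
    {"id": 501, "feature": "export", "keywords": ["vendor-query"], "status": "open"},
    {"id": 502, "feature": "export", "keywords": [], "status": "open"},
    {"id": 503, "feature": "import", "keywords": ["vendor-query"], "status": "resolved"},
    {"id": 504, "feature": "import", "keywords": ["vendor-query"], "status": "open"},
]

# Group the still-open vendor queries by feature, so each feature owner
# sees their outstanding queries in one place.
open_queries: dict[str, list[int]] = {}
for d in defects:
    if "vendor-query" in d["keywords"] and d["status"] == "open":
        open_queries.setdefault(d["feature"], []).append(d["id"])

print(open_queries)
```

Resolved queries stay searchable under the same keyword, which is what lets vendors check for an existing answer before raising a new query.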

