Sunday, November 16, 2008

A look at my notes

For the purposes of my own interpretation of GTD, and a couple of thoughts that wanted to work their way into that process, I've looked at some of the notes I've been taking. Ouch! What is the point of a note without context? Even a date would be good, and I'd call it criminal to leave out the project.

Basically I've run through notes from 2006 through to now. The earliest notes are actually quite good at recording the topic and date, just not the reason for their existence. Of the nine months of notes from 2006, only about 10 days of them were worth keeping. That equates to a lot of pages that have at long last been tossed.

The more recent notes have definitely improved in terms of relevance. Unfortunately it seems it must have been intuitively obvious at the time when they were written and which project they were written against, because neither is recorded. I get to keep a lot more as reference; I'm just uncertain how to relate them to this world of references I'm creating for myself.

I now hope to take the following into account:

  1. Context
    1. Date of the event
    2. What the notes refer to
    3. Who else was there
  2. Relevance
    1. It must make sense
    2. It must have a purpose

Wednesday, November 12, 2008

Predisposed to Pessimism?

It would seem that there is an inherent tendency towards a negative outlook within the universe.

There are basically three final states for the universe, depending on its space-time curvature, and each has a rather unappealing end.

In a closed universe the curvature is such that the universe does not have escape velocity. There will come a time when the expansion reverses and the great collapse begins. This is known as the Big Crunch. It could be that the universe will Big Bang again after the crunch but it is likely that the replacement universe will not have the same initial conditions and thus will not be the same place we know.

An open universe will continue to expand through eternity. It is likely to slow but will never reach a point where it stops expanding.

A flat universe's expansion will slow until it eventually reaches an equilibrium point - a stable position. It was the attempt to attain this ideal that led to Einstein's "greatest blunder".

Ultimately, though, a flat and an open universe will suffer the same fate, although the open universe will reach it first. When stars die, they release heavy atoms into the universal stew. Heavy atoms are not as good a nuclear fuel as the lighter elements, especially hydrogen. This means that as the universe ages, the fuel available for stars does not burn as furiously as in the earlier universe, and there is therefore a reduction in the energy generated.

Another consideration is that the basic constituents of an atom also have a shelf-life, even if it is a pretty long one.

The long term picture is not all that great. In one case everything goes splat and in the other everything eventually disperses in a haze of subatomic particles. Not a happy place.


Monday, September 29, 2008

Test Driven Test Development?

While the equivalent of Test Driven Development (TDD) is a realistic possibility for automated test development, that is not what I'm considering here. Test Driven Test Development (TDTD) is the unrealistic proposition of trying to get the advantages of TDD for manual test cases.

TDD relies on the development of unit tests prior to the development of the code. Developing unit tests generates additional artifacts - setup, teardown, unit and integration tests. Mocking is also a serious possibility for extending the test capability.

The relationship to automated tests should thus be apparent since any decent automated test is scripted or coded. Unit testing the automated test is therefore possible - and would be more so if the tools and languages they were based on had intrinsic unit test capabilities.

But this is not about existential capabilities but rather extensionalism in the arena of the weird and useless :) How to map the artifacts:

  • Setup - preconditions
  • Teardown - post-conditions
  • Unit test - this would seem to be the issue
So what would a unit test of a test case be? There are some things that could be considered:
  • Completeness tests
  • Coverage metrics
  • Usefulness guesses
  • Estimates of it successfully finding issues
  • QA measurements
  • Ease of repeatability
So maybe TDTD is not quite so weird a concept, as there are tests that can be performed against the test cases. Sort of makes one wonder why it seemed such a strange concept initially.
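As a very rough illustration of the idea - in VBScript only because that's the scripting language that crops up elsewhere on this blog, and with fields and checks that are entirely made up - a "unit test" of a manual test case might look something like this:
' Hypothetical completeness checks run against a test case held in a dictionary.
Dim tc: Set tc = CreateObject("Scripting.Dictionary")
tc.Add "Precondition", "User is logged in"                         ' setup
tc.Add "Steps", "1. Open the defects module. 2. Click New Defect."
tc.Add "ExpectedResult", ""                                        ' deliberately left blank
tc.Add "Postcondition", "Discard the new defect"                   ' teardown

Dim problems: problems = ""
If tc("Precondition") = "" Then problems = problems & "No precondition (setup). "
If tc("Steps") = "" Then problems = problems & "No steps to execute. "
If tc("ExpectedResult") = "" Then problems = problems & "No expected result, so no pass/fail criterion. "
If tc("Postcondition") = "" Then problems = problems & "No post-condition (teardown). "

If problems = "" Then
MsgBox "Test case passes its own tests"
Else
MsgBox "Test case fails: " & problems   ' here: the blank expected result
End If

Completeness is obviously the easiest of the listed checks to automate; coverage and usefulness would need rather more thought.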

Tuesday, August 26, 2008

Performance Testing Confusion

Among the huge number of things that I have little knowledge of is performance testing. Interestingly, I find myself in a position to view the development and execution of a performance test. I'll concede that the parties I have as models are probably not of the highest caliber, but then you use what you have. Anyway, these are some of the observations I've made:
  • It requires a special skill
  • Strategy documents are superior to check lists
  • Results don't need to be in context
  • Execution only needs to be outlined
  • Data should be someone else's problem
I've never yet encountered a document that could replace the convenience and ease of reference of a check list when the information being tracked is growing and evolving at each step. Consequently, I disagree with that notion.

No contextualization of the results? Kidding... surely? A summary may be sufficient but then a summary contains a "summary" of the explicitly rendered results. Wait though. There are more graphs than text. That must count right? I mean each has a useful description like "the transactions per second graph shows the number of transactions per second"... epic.

Baseline, load and soak will be run. That is an execution plan? Sure. Short and to the point. What though will happen as time gets shorter and problems refuse to get resolved? Contingencies? Eerie.

Data obviously can't belong to someone else. Surreal.

Skills. My current view is... that I need more input here, although it's probably QED.

Sunday, August 24, 2008

Is the male sex the weaker?

A potentially contentious thought, as it's quite contrary to commonly accepted ideas.

Males are commonly more heavily muscled and taller than their female counterparts. These differences mean that females are frequently regarded as being weaker than men. These arguments can be considered to be sexist and therefore of no consequence.

As illogical as the argument is, it is still taken as fact. A point to dispute the assertion is that females generally outlive males by 5 to 10 years (at least in western societies). Surely this indicates that the male is flawed, or at least weaker?

There are low level differences between the sexes and these can account for the variation in lifespan. These factors are studied by the scientific community. Put in perspective, the scientific variables point to the fact that the sexes are equal, except that the factors that make up a male result in his body being worn out before hers.

There is another, less attractive, proposition. In the past, childbirth was a pretty dangerous occupation for both the woman and the child. The large number of deaths would have acted as a selective pressure on the surviving female population. Charles Darwin pointed out that evolution is based on exactly this kind of natural selection.

So has society in the past had an impact on the human species? Have we effectively practiced selective breeding by caring more for boy children in the past so as to ensure that they survived through to maturity? Has the predilection of the paternalistic society put males at a disadvantage?


Sunday, August 3, 2008

JavaScript to Generate Data


JavaScript is a pretty powerful scripting language. Tied into DHTML, it provides a means of dynamically controlling a page. The control can result in something that could almost be called an application.

Having found that PHP had limitations for generating data, I decided to work on a version under JavaScript. I can't say that it's totally complete - it doesn't do numbers, dates, or fixed length strings with a fill character - but it was fun to put together.

Another good reason to use JavaScript is that it can be used both in an active web page and as a standalone tool (through a browser), with little need to actually make any modifications. The supporting tools add extra value - JSMin to reduce the web version of the library and JSLint to verify that the code is correct.

While this tool works for its limited string options, there are a number of opportunities for improvement. Even a little code prettifying would be good - let alone separating it into object components for reusability.


Sunday, July 27, 2008

Getting out there

After much deliberation, I eventually completed an article targeted specifically for publication. Amazingly to me, StickyMinds allowed it through their filtering process. While it has been a while in coming, I have managed to get an article into a real location.

The idea for the article actually stemmed from a blog posting. The blog posting arose out of doing a little training of folks on QTP. A seemingly varied path, but I'd have to say that it does help to clear things in your own mind by pointing them out to others.

The idea for another potential article now needs to resolve itself into a bit of action. The idea for this one arose out of the writing of the other - not so much a sequel as another piece clarifying a concept in test automation.

In putting up a list of the things I've done, I see that there are only a few and they are pretty far apart. They also seem to be based on a theme. At least I've done a few things...


Thursday, July 17, 2008

Scripting Customizations in Quality Center

HP's, or ex-Mercury Interactive's, Quality Center is a test management tool. This means it provides a central storage area for test requirements, test cases, test execution records and defects. Great, huh? Well, it might be, but for the fact that testing is meant to identify as many bugs as possible within an application. The problem with all test management tools is that there is a focus on the management aspect at a loss to the target purpose of performing testing. This is a wonderfully general statement.

Today I was asked to look at a customization bug in QC. I sympathize with developers... how can a tester write a bug report without explaining the problem? I actually had input from four different people about the problem and in the end still couldn't tell anyone what the problem was. In the end I looked at the project, tried a few things, and my best guess was that the field change rule was not being activated on a new defect while it was being activated on editing a defect.

Having at least found a problem... I then looked at the script. Having helped out before, I knew that I was heading into a realm of badly coded, hacked together, copy-paste nightmares. It may have a VBScript backend but that is no excuse. Today's function was the worst I've seen. The entire block of code under the new defect customization was a copy of the field change customization. Besides the fact that the fields are blank, and so changing some entries to prettier strings is a futile exercise, all the code is hidden within an "On Error Resume Next" block.

Herein lay the problem. The problem I uncovered was another instance of attempting to set a value in a field based on the entry in another field - a field which is blank, and has to be, on a new defect, and so is happily caught by the field change customization. I caught the error by simply including the error description in a message box. I fixed the error, ran it again and received a different error. This second error was the annoying one. A half hour later, with everything in the function commented out and an error reset command prior to the msgbox, I was still getting the error.
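For reference, the debugging trick was nothing more sophisticated than the pattern below - a generic VBScript sketch rather than the actual workflow code, so the failing line is just a stand-in:
' Generic sketch of the approach, not the actual QC workflow code.
On Error Resume Next

brokenValue = CInt("")   ' stand-in for the copied logic that reads a blank field

If Err.Number <> 0 Then
MsgBox "Swallowed error: " & Err.Description   ' surface what On Error Resume Next hides
Err.Clear                                      ' the "error reset" before the next message box
End If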

My conclusion is that the function customizations in QC are not scripted through a clean interface, as they trigger an "object does not support that function" error. So much for knowing whether or not the code has problems.


Wednesday, July 16, 2008

Information Transfer

Besides wanting to be free, it seems that information has trouble presenting itself in a way that can be easily assimilated. There are a lot of ways of presenting data:
  • Lists
  • Maps
  • Tables
  • Graphs
  • Diagrams
  • Paragraphs
The question, though, is what works best. Context is an obvious starting point. It does not make much sense to use a paragraph when noting which items need to be bought at the store. Lists may be a convenient way to capture a large amount of information, but do they enable easy information transfer?

It seems that the easiest means of passing information on to another person is to present it to them in a standard form. A colloquial, conversational type of document is more likely to transfer information than any of the list-based mediums.

Lists are hard. They take a lot of effort to process adequately, and the longer the list the worse this gets. This is because there is no flow of ideas. My personal experience is that I skim lists, go back to them, and treat the items as independent entities. Lists cannot carry information that is critical to the view being expressed.

Tables may be an efficient way of consolidating information. The problem is that they can't be read. The association between the various cells is via the header of the column and the row. There is a discontinuity.

Information is transferred by the story it creates as it moves through the various stages. The critical feature of the story is the progress. I can't say for sure, but I'd recommend a growth in complexity of the topic as the story progresses. This builds on the information already imparted.

Monday, June 30, 2008

Quick Notes on Families

It seems that there have been a number of different family arrangements between men and women through the ages and in different cultures.

The most common bond has been the conjugal family, which consists of a man, a woman and their children.

Another well known form of bonding is polygyny, wherein a man has many concurrent wives. The practice of polygyny is quite rare and mostly confined to the wealthy classes. Concubinage, fairly common among the upper or ruling classes throughout history, is a form of polygyny.

A rarer form of bonding can also occur, in which a woman has several concurrent husbands. This has been noted in Tibet and among the Todas, where all brothers marry the same woman to create a polyandrous relationship.

The various bonding types are probably a reflection of the dominance of patriarchal societies, as there have been very few matriarchal societies - although the Mosuo women are still in control within their corner of the world.

A seemingly disturbing but apparently common practice was wife sharing. This was the practice of sharing your wife with another man for a special occasion or simply for an overnight visitor. It seems that this can still be found in the modern world.

Wednesday, June 25, 2008

Testing Value

I'm unclear as to what my actual interpretation of the value of testing is, but I need to get some thoughts out.

Software testing does not have a measurable deliverable in the sense of a tangible product. It basically provides a potential of value. This potential can be attributed to the following:

  • An in depth understanding of the system which might not be available within any other development team
  • Insuring the expectation of the system's potential
  • Identifying additional uses of the system
You don't buy insurance for the return on investment but rather as a means of covering yourself should the insured item become in some manner impaired or absent. It should thus be somewhat reasonable to see the cost of testing as being similar to the purchase of an insurance policy. The value insured is where the question re-arises. An insured item has a price and it is reasonable to expect some sort of guarantee against the cost. In insurance the guarantee is only as good as the company holding the policy. Would anyone put a guarantee on the state of the software after it has been through the testing cycle?

To come back to the insured value: to obtain one, you could look at the business requirements. A conventional technique is to prioritize the functions. The prioritization process will in itself identify those areas that are the most critical to the business. It is likely that these processes are the ones that add the most value and thus need the most insurance (or testing). Using some weird weighting against tester cost-to-company figures, it's intuitively obvious that a dollar value can be generated for the value the testers are there to insure.
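To make that hand-waving slightly more concrete, the arithmetic I have in mind is no more than the following - every weight and figure here is invented purely for illustration:
' Invented figures: weight each prioritized function by its business value,
' then compare the total "insured" value with the cost of the test team.
insuredValue = 0
insuredValue = insuredValue + 0.5 * 1000000   ' critical function, weight 0.5
insuredValue = insuredValue + 0.3 * 400000    ' important function, weight 0.3
insuredValue = insuredValue + 0.2 * 100000    ' nice-to-have function, weight 0.2

testTeamCost = 250000   ' testers' cost to company

MsgBox "Value insured per unit of testing spend: " & insuredValue / testTeamCost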

From this scheme it can be noted that various supplementary metrics can be applied against the generated dollar amount. These metrics could provide business level feedback on the cost a bug would have represented had it not been found (using the weighting to cost, fudged with severity). It could be interesting to establish a basic formula for doing this - and to see whether it might even be remotely useful outside of theoretical obscurity (or just be hilarious).

Information on a system is a long term benefit that could be realized only by the establishment of a software testing department with effective retention policies. Cross system or domain knowledge can be considered to be an additional value brought in by the test team. This would be a fuzzy value that equates to reduced time to realization rather than a physical amount. It could also be interpreted as improving the quality of the test effort using past experience. I'm not sure whether some scheme could be devised to measure this in terms of a dollar value...



Saturday, June 21, 2008

Hxaro and Sharing

Hxaro is a cultural aspect of the San. It basically involves the creation of obligation as a means of insuring against the future.

Understanding the San way of life provides ethnographic insight into the Late Stone Age cultures in southern Africa. This means that a potential understanding of how people lived in the last 2000 years can be gained.

Hxaro is interesting - especially when considered in light of gift giving within the Roman Empire. The mechanics of hxaro involve the manufacture of a gift which is then handed to a specifically chosen person. The gift is usually something decorative or practical - not food. The gift may be kept and used by the recipient or passed on to someone else. When a hxaro gift is passed on it forms a chain of responsibility, with the subsequent recipients inheriting an obligation to the original giver. The partners in hxaro may allow the relationship to drop or may foster it. Dropping a hxaro partner becomes more difficult once a chain has been established.

In choosing a hxaro partner, environmental factors are considered. Having a wide range of potential partners means that there is more scope when some difficulty arises within the environment. When the region cannot support a tribe or family group, they manufacture gifts to take to their established hxaro partners in other regions. As the relationship is bidirectional, the hxaro process has the effect of sharing the risk.

Gifts in Roman times were primarily used as a means of gaining support for illicit practices and other general bribery. It's quite refreshing to come across a practice that is mutually beneficial as well as establishing an obligation for both partners.


Tuesday, June 17, 2008

Generating Data with PHP

I'd pretty much claim to have forgotten how to handle a piece of code, only to have my memory almost immediately jogged by a short search. Anyway, I have completed the string generator in PHP. Possibly the most interesting point of the exercise is the strings that are created - some are just plain weird.



The weirdness is possibly the main attraction, especially if one is not limited to boring strings like alphabetical and alphanumeric - although a quick random ASCII will show if there are any problems. I extended my initial release to include generating printable ASCII in addition to using the full ASCII encoding. This means that the string offering is only missing the Unicode option.

I would say that just writing the generated string straight to the web page does not result in the prettiest of output options, but it serves the purpose (at least for now).

Monday, June 16, 2008

Restart on PHP

After a few years I've picked up on PHP again. I'm reminded of how powerful a language it is - as well as how much of it I've forgotten.

My primary goal in this exercise is to be able to generate random values that can be used as test data. I started with a class for generating random strings (alphabetical, alphanumeric and ASCII) along with various formats (all capitals, all lower, sentence, title and random) as either of fixed length or of random length within a range.

This was relatively simple. The issue, and where the memory hole exists, is in processing input so that the values can be generated as needed. I remember processing the POST variables as well as doing input validation with error messaging. The solution is probably quite simple; I just need to finish this before moving on to the next bunch of ideas.

Thursday, June 12, 2008

Animation Fun

Pivot Stickfigure Animator can be a problem in that it illustrates a tendency towards the macabre. However, my own misguided musings aside, it's also extremely easy to use and a bunch of fun.

Monday, June 9, 2008

Data Generator

Data is probably the most important part of any application test.

Imaginative data is more likely to locate the bug than its more conventional friend. However, creating weird data can take a lot of time. To alleviate the strain, I've added my 2c to the field with a macro driven Excel spreadsheet.

There are drawbacks to the approach I've taken. The first is that it is scripted in Excel. The second is the number of codes that I needed in order to cover all the options that are included. The codes themselves aren't too complex, but there are enough to make remembering them a problem.
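To give an idea of what the codes look like - the real thing is an Excel macro; this is just a VBScript-flavoured sketch with made-up codes, where something like "AN12" means a 12-character alphanumeric string:
' Made-up code scheme: the first two characters pick the character pool,
' the remainder gives the length, e.g. "AB8" or "AN12".
Function GenerateFromCode(code)
Dim pool, length, i, result
Select Case UCase(Left(code, 2))
Case "AB": pool = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
Case "AN": pool = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"
Case Else: pool = "!#$%&'()*+,-./0123456789:;<=>?@"   ' crude stand-in for the ASCII option
End Select
length = CInt(Mid(code, 3))
result = ""
Randomize
For i = 1 To length
result = result & Mid(pool, Int(Rnd * Len(pool)) + 1, 1)
Next
GenerateFromCode = result
End Function

MsgBox GenerateFromCode("AN12")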

Thursday, June 5, 2008

Lifestyle change

A while ago I decided that things weren't going well. I was overweight and pretty unhealthy.

My sister and parents were pushing me towards a low GI diet. Diets, however, are not something I believe in or want to deal with. The resolution was to make a lifestyle change.

A lifestyle change is more encompassing. For a start I not only decided to change what I was eating but also to increase my physical activity. As I have previously only enjoyed doing weight training as a form of exercise, this was an obvious choice to me. A single change would have been insufficient and so I incorporated some of the basic low GI principles within the program I decided upon.

Since then, I have trimmed down nicely and improved my physical ability as well as my overall health. Of the things I can credit for the biggest success, it's being committed to reaching a goal and making a note of what I was doing in order to get there.


I'd say that writing down what you do and have done is the greatest motivator in keeping to the plan. There were some periods when I skipped writing down what I had eaten (although still recording the exercise routine) and in those periods I had the most trouble in keeping up with the overall plan and had the least progress towards the final objective.

Saturday, May 31, 2008

Metaphorical Description of Automated Testing

In the forest of software testing strategies there is a weeping willow known as functional automated testing and an oak known as performance testing. These relatively minor players in the woods provide support to the greater surrounding edifices and ensure their majesty against slight.

The willow has a great reach. With a feather-like touch the willow gently encourages the furtherance of the development of the citizens it touches. Its outstretched drooping branches touch a great many of the other forest denizens. A wispy encouragement to move into uncharted areas with the secure knowledge of the willow being near and ready to provide any support.

The majestic oak is a stalwart solitary figure. It provides a backbone of strength upon which the forest population can depend and use as a central core around which to develop and grow. From its strength comes the certitude for the future as well as the assurance that the present is adequately covered.

A typical forest is not solely composed of trees and this one is much the same. The mistletoe can be considered to be a parasitic lecherous growth sapping the very life from the tree on which it grows. In this case, however, it is not the essential essence of the woody-citizen that is being drained, but those things which imperil its beauty. As such the mistletoe test enablement tool magically transforms its victim into an ideal focused specimen with little superfluous baggage weighing down the lofty elegant idealisms to which it was born.

Willows, oaks and mistletoe cannot grow without a suitable environment. Knowing how, where, when and which to plant is of primal importance to a successful forest. This is the realm of the horticulturist – a specialist within the greater forest of software testing.

Sunday, May 11, 2008

Data Driven Frameworks

The notion of data driven automation tests is in itself a notion of grandeur. It is a highly commendable practice; however, what makes an actual data driven script?

The basic notion of a data driven script is one in which the data has been separated from the code. In its simplest form, a data driven script can be considered to be a simple case of centralizing the data in variables. The variables can occur in a group within the script and can be modified at need and in a central location.

The above notion is much the same as a tool vendor's concept of a data driven script. How much extra benefit is achieved by simply moving the data variables to an external file? Simpler interface. Maybe. Sure. Huge numbers of data variances for the same script. Not entirely. An array would provide the same features and would probably reduce execution time as well. Essentially the point of using an external data file is thus the simple interface that the file tool might provide.

Let's set aside my disillusionment and implement a data driven script in the above manner. It works wonderfully. Along comes an interface change. The changes require that 3 fields be added, 2 edits change to lists and 1 field is removed. Let's look at what the changes entail:
  1. The data file is updated to include the new fields and remove the old fields. The new list entries are also updated to ensure that the correct items are selected
  2. The GUI/object reference map is updated with the new objects and to update the edits to lists
  3. The script code is updated to point to the additional fields. The function calls for the edits are changed to function calls for lists. The code for the removed field is commented out or deleted
But wait a bit. This is a data driven script. Why does the script code have to be updated? Surely a data-driven script precludes the necessity of having to modify code; I'd expect that to be the point of the exercise. So we'll call the implementation above a pseudo data driven framework, as it only provides the illusion that the script is data driven.

So what makes a true data driven script? Quite simply, a means of mapping the data parameters to the object references. The map will provide a means of identifying an object and its data value by a common reference. By following a simple object naming rule, it's simple to identify the type of object being dealt with (for example, using "ed" in front of edit fields and "ls" in front of list fields). Using an object naming rule has drawbacks in terms of having to maintain the names. However, the automation tool may well provide a means of identifying the type of object through its internal mechanisms. If such is the case, then the naming scheme can focus on what it should focus on: the object's user name/purpose.

Having the object type means that the function needed for that object is immediately known. A function developed to take the type of object and perform the appropriate action simplifies the scripting. By parsing through the parameter list of a data file with an object-to-action function, it's possible to create a script that is purely data driven. In this environment the following changes would be required (using the same example):
  1. The data file is updated to include the new fields and remove the old fields. The new list entries are also updated to ensure that the correct items are selected
  2. The GUI/object reference map is updated with the new objects and to update the edits to lists

So it should be possible to only have to maintain data. Maintaining data is a lot less effort and a lot less risky than maintaining scripts.
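A minimal sketch of what I mean, using the naming rule above - the object names and pages are purely illustrative, though the Set, Select and DataTable calls are the usual QTP ones:
' Illustrative only: map data column names to application objects once,
' then drive everything from the data file through one dispatch routine.
Dim objMap: Set objMap = CreateObject("Scripting.Dictionary")
objMap.Add "edUserName", Browser("App").Page("Login").WebEdit("UserName")
objMap.Add "lsCountry", Browser("App").Page("Login").WebList("Country")

Sub PerformActionOn(columnName, value)
Select Case Left(columnName, 2)
Case "ed": objMap(columnName).Set value      ' edit fields get Set
Case "ls": objMap(columnName).Select value   ' list fields get Select
End Select
End Sub

' The script itself is then nothing more than a loop over the data columns.
columns = Array("edUserName", "lsCountry")
For i = 0 To UBound(columns)
PerformActionOn columns(i), DataTable(columns(i), dtLocalSheet)
Next

With something like this in place, an interface change touches only the data file and the object map, which was the point of the exercise.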

Sunday, April 27, 2008

A Recorded Framework

Record and playback carries a stigma. Careful use, though, can result in something that is usable.

Maximizing reuse is the obvious choice in any framework. With reuse one minimizes the amount of effort required in maintaining the automated scripts. There are a few considerations when looking at a tool to achieve this:
  • Does the tool support calling scripts from within a script?
  • Does the tool support another means of calling a script?
  • Can a script be saved as a function and called as a function?
  • Does the tool support a global object map?
  • Does the tool support associated libraries?
These various features reduce maintenance by facilitating reuse. The structure of the scripts also needs some consideration:
  • How to pass data?
  • How to handle failure?
  • What granularity should be used in setting up the scripts?
These considerations depend on the capabilities of the tool. The most useful option in dealing with them is to use a more advanced framework, which in general means that scripts are coded rather than recorded.

Sunday, April 20, 2008

Test Automation Frameworks

There is confusion over what a framework is. So what should one call a framework?

To me a framework is simply an implementation strategy. An idea that is applied to the whole test automation project. I've seen more focused definitions but they don't make much sense. My broad definition may open the door to the reviled record-and-playback, but it can be used as a test automation strategy (although with a lot of care) and so should be included.

The generally accepted primary framework types are:

  • Data driven
  • Keyword driven

There isn't anything fancy about these except the amount of code that needs to be written in order to get them to work. To keep things simple, I'll post separately on each of these types.

Wednesday, April 16, 2008

Reporting in a Technical Area

Reporting where the project is and what you've been doing is a common feature within a development project. Reporting is however not something that's liked since it takes up too much time (among other reasons).

Personally, I track what I do in a TiddlyWiki. More than that seems to be redundant. The basic features of a Wiki make it a great tool for tracking what you do on a project. As a notebook it is superb, and by adding the cross references (links) and an index (tags) it surpasses any solution I have tried in the past.

However good this is as a personal solution, it doesn't cover the needs of management. The often rigidly formatted documentation can be a real pain to work through, as format constraints often exceed usability.

In consideration of this, as well as finding myself in a position with a sudden need for status reports from others, some reasonable solution had to be found. Among the things I'm experimenting with is a near-agile test automation development process. Of the things from the agile development world, I've pulled in the cycle and the need to plan what to do in each cycle.

My original idea was to use a two week cycle. This did not tie in with the classic development processes adopted outside of our group. As such I've switched to a one week cycle, which is way too short, but it does mean that there is synergy between us and the rest of production.

With each cycle is the associated planning for the cycle. The planning is a simple top-down approach - task with sub-tasks. The tasks and sub-tasks are prioritized (high, medium, low) and scoped (short, medium, long). This information forms the basis of the ultimate report.

For now, I've put the planning in an MS Excel file and supported the basic actions with macros. The macros pull the data from the planning cycles into a formatted template. The priority and scoping data is being used to generate an approximation of the actual effort - total and remaining.
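The approximation itself is nothing clever. The real thing lives in the Excel macros, but the arithmetic amounts to something like the following, with the hours per scoping band invented for illustration:
' Rough shape of the effort approximation; the hours per scope band are made up.
' Priority could similarly be used to weight or order the tasks.
Function EstimatedHours(scope)
Select Case LCase(scope)
Case "short": EstimatedHours = 4
Case "medium": EstimatedHours = 16
Case "long": EstimatedHours = 40
End Select
End Function

tasks = Array("short", "medium", "long", "short")
completed = Array(True, False, False, True)

total = 0: remaining = 0
For i = 0 To UBound(tasks)
total = total + EstimatedHours(tasks(i))
If Not completed(i) Then remaining = remaining + EstimatedHours(tasks(i))
Next

MsgBox "Total: " & total & " hours, remaining: " & remaining & " hours"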

This seems to be an effective solution. Planning is a good thing which is being positively reinforced by the least-effort status report. I just need to put some more thought into the effort estimates as well as into some additional efficiency options.

Monday, April 14, 2008

Exploring the QTP OR limitation

The notion of being able to use a reference key to access an object is important from a script creation and maintenance perspective. Sure, the basic operations seem to work fine, but it's infinitely less effort to simply change a data file and object map than to do anything with code (as well as mitigating the associated risks in changing code).

In my continued search for a use for the OR, I looked at two possible options. The first was to define the objects within a page with special prefixes - like "ed" for an edit field, "ls" for a list field, etc. By using these object names as the data table parameters, I figured that I would instantly have a mapping of sorts.

The first issue I realized I had not accounted for within this structure was the parent. Sure, I have the child, but I have no idea of its parents within this formulation. A simple workaround was to pass the parent object to the function that handles the data table processing. Another solution that would work would have been to use the entire parent object structure as the identifier - this I didn't try since it would be too ugly. Given the parent object and child name, it's possible to create the child object using the prefix key. The object definition could be made in the following way:
' parentObj here is the parent's definition passed in as a string, e.g. "Browser(""App"").Page(""Main"")"
Select Case Left(objName, 2)
Case "ed"
Execute "Set appObj = " & parentObj & ".WebEdit(""" & objName & """)"
End Select

As a solution this works, but having to track the parent object is not exactly the aim of the exercise. Sure, using the entire object definition as a column header would work, but it doesn't make for simple data management, not to mention the headache of maintaining the column headers when the application changes.

The next obvious alternative is to define the object against a reference key within a dictionary. The reference would be the key, and would be used as the parameter in the data table. This is a fairly elegant solution, which has the following appearance:
Dim appObjects: Set appObjects = CreateObject("Scripting.Dictionary")
appObjects.Add "myEditField", Browser("A Browser").Page("A Page").WebEdit("myEditField")


A generic means of identifying the type of object, so that the correct action is performed on it, is then a simple task. This is needed so that, for entering a value, one can call the correct function, be it Set or Select. One solution is the following:
Set appObj = appObjects(keyString)
If InStr(appObj.ToString(), "edit") Then
appObj.Set DataTable(keyString, myDataTable)
End If
Ultimately this works. The issue is that I question why I've defined the object in the OR. Does the OR really save effort over using DP?

Redefining the object within a dictionary just to simplify the scripting process doesn't seem to be logical. If I throw out the OR and use DP then I've one less place for performing maintenance.

I still don't know why you would use the OR. One more possibility remains to be explored and that is the OR object model. A brief look has left it looking as light on the useful side as the OR itself but I'll give it due consideration in time.

Thursday, April 10, 2008

Using QTP with the OR

Quick Test Professional is only an average tool as far as functional regression test tools go. Ultimately the most reasonable use seems to be to bypass the features that cause the tool to cost what it does and to use the functionality that is less advertised.

One of the so-called great features is the store for the objects that comprise the application. The first version I had my hands on was 8.2. It could store objects, but maintaining them was not really a part of the feature set. 9.2, the next version I got to use, had corrected this minor oversight. However, given my earlier experience, I didn't immediately explore the changes. Having a little time, I've now done so, and ultimately I'm not sure that it's worth it.

The object repository is the central store of objects within a script or across multiple scripts, with a pretty GUI and, now at least, editing features. The structure is its downfall. All objects in QTP can only be addressed via the parental structure, viz. Browser().Page().Frame().WebElement(). While this is all fine and well, and admittedly does enable one to access any element in the page fairly simply, it has a major drawback within QTP's design: you cannot access an element without including its parent structure.

Sure, you could argue against this being a drawback. The reason it is a flaw lies in the simplicity of test case design and scripting that is afforded by being able to simply refer to a user key or reference name. So let's contrast conventional design with simple design...

In conventional design, much like a recorded script, for each page you address each object individually with the action you want to perform, viz.
' placeholder object names, purely for illustration
Browser("App").Page("Main").WebEdit("Field1").Set "value 1"
Browser("App").Page("Main").WebEdit("Field2").Set "value 2"
Browser("App").Page("Main").WebList("Field3").Select "item 3"


This is not detrimental in itself, but when you have 20-odd fields on a page and 5 pages to cover within a single script, it rapidly becomes a sad idea. Since a data table has to be created anyway, and that data table will (or at least should) have column names that correspond to the fields within the application, it's pretty clear that being able to address the OR entry by the column reference would make the entire script simpler and a whole lot easier to maintain, viz.
AListOfColumns = Array("Field1", "Field2", "Field3")
For i = 0 To UBound(AListOfColumns)
PerformActionOn AListOfColumns(i), DataTable(AListOfColumns(i), dtLocalSheet)
Next


Being able to reference an element in the OR via a reference key in such a way is seemingly impossible. So far the closest I've been able to come is to pass the parent object along with the reference object string to the PerformActionOn function.

The stumbling block seems to be the identification of the parent when you only have the child. Yes, it's possible to create an external map of the objects, using a reference key mapped to the entire parent-child relationship, but that defeats the object of this current exercise.

Wednesday, April 9, 2008

Manifestations of Word

MS Word 2003 is a departure from sanity.

Most irritating must be the uncontrolled style growth. How do you manage several styles that are all slight variations of what you intended to use, and would use, if only the style wouldn't keep changing on you? Given that I could adequately use the style settings in 97, this "feature" in '03 has to be a step backwards.

Today's annoyance arose out of simply trying to have table headers cross pages... highlight the row, select the option in the Table menu and... nothing. Even after figuring that common sense was an inadequate approach, hitting Google to see what was there produced no better results.

Turns out that having a page-width table set to have text wrap around its sides prevents the table header from being carried across pages.

LaTeX is my text editor of choice.

Friday, March 28, 2008

Planning

Planning is an important part of a successful accomplishment. Seems like a simple principle but how often is it actually done?

A top down approach seems to be fairly effective. On getting a task, break it down into a few high level ideas on how to resolve it. When actually starting on a high-level idea, break it down into smaller tasks. While working through the objectives, add additional tasks as soon as they come to mind. This means that the list of subtasks is representative of the actual amount of work entailed by the task.

Ultimately the task list will enable improvement in estimating how long a task will take. By not removing items from the list, it forms a record of the thought process involved at each stage of completing the task.

Saturday, March 1, 2008

Hello world

Seems appropriate to start with the most common of all starting points.

What does it do? Not much. Just a way of starting out and being able to do a basic operation of displaying a message - which can be a good thing.