If it can’t fail, it’s not a check
A basic trait of every script is a clear determination of result: it passed or it failed. ‘Sorta’ is not a state we want in automation. It’s okay-ish in exploratory testing (where it leads to other avenues of exploration), but not here.
Here is a script I stumbled upon this week.
<pre lang="python">@pytest.marks('deep', 'videosearch')
def test_next_navigation_not_visible_for_less_than10_search_result(self):
    search_results_page = self.video_dashboard_page.perform_search(SearchTerms.random_video())
    if search_results_page.number_of_search_pages() == 1:
        # the assertion lives inside this branch, so it only runs
        # when the search happens to return a single page of results
        assert not search_results_page.next_navigation_visible()</pre>
<p>Notice that if the data criterion is not met, the determination of result will always be a pass. In essence,</p>
<pre lang="python">@pytest.marks('deep', 'videosearch')
def test_next_navigation_not_visible_for_less_than10_search_result(self):
    pass</pre>
<p>This is the sort of thing that causes failures to be reported as passes (since the check is never reached) and gradually erodes trust in the automation. The fix was to fail the script outright when the necessary data wasn't returned, and to do the assert only once we had everything the check needed. (As discussed in <a href="http://element34.ca/blog/pagination-for-data-sets-you-dont-control">Pagination for data sets you don't control</a>.)</p>
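<p>A minimal sketch of that fixed shape, using a fake page object in place of the real one. The method names (<code>number_of_search_pages</code>, <code>next_navigation_visible</code>) are assumed from the snippet above, not the actual page-object API:</p>

<pre lang="python">class FakeResultsPage:
    """Stand-in for the real search-results page object (hypothetical)."""
    def __init__(self, pages, next_visible):
        self._pages = pages
        self._next_visible = next_visible

    def number_of_search_pages(self):
        return self._pages

    def next_navigation_visible(self):
        return self._next_visible


def check_next_navigation_hidden(results_page):
    """Assert the 'next' link is hidden, failing loudly when the data
    precondition (a single page of results) is not met."""
    pages = results_page.number_of_search_pages()
    # No results at all means the setup failed -- that is a fail, not a pass.
    assert pages != 0, "search returned no results; cannot verify pagination"
    # More than one page means the data criterion was never met -- also a fail.
    assert pages == 1, "expected a single page of results, got %d pages" % pages
    # Only now do we have what we need to do the actual check.
    assert not results_page.next_navigation_visible(), \
        "'next' navigation is visible despite a single page of results"</pre>

<p>Every path through the check now ends in an explicit pass or an explicit fail; there is no branch where the assert is silently skipped.</p>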
<p>Just another thing to keep in mind when scanning through your automation code.</p>