Helping Google Stifle Black Hat SEO

Search Engine Optimization (SEO) seemed to me to be a strategy that web vendors and spammers used to drive traffic to their sites. As a web user and novice web designer, understanding how Google separates white-hat from black-hat SEO (the good guys from the bad guys) helped me make a little more sense of it.

Google also depends on regular searchers to help by pointing out when something is wrong with search results. If you do a search and something in the results doesn’t fit or looks sketchy, you can point it out to Google. Kevin Purdy, in his TechRepublic article “Give Google better feedback and bug reports and get better results,” shows us how.

[Screenshot: Google Feedback link and popup]

Have you ever noticed the “Send feedback” link at the bottom of your search results and thought, “Yeah, no thanks. I’m not writing an email to Google or going to another tab to fill out a form. I just want better search results, so I’d rather spend my time trying again”? As it turns out, JavaScript keeps you right in the page, where you describe the problem in words and then show Google by highlighting what was wrong. Then you can go right back to your search. Google takes this user feedback seriously and uses it to improve its algorithms. Maybe you don’t see the direct results, but it’s for the “betterment” of the web!

I also have to give credit where credit is due: I learned the most about SEO and what Google does from an article I read in another library school course, LIS 451 (though I liked Purdy’s visual take on one of the biggest takeaways I got from that article): Cahill, K., & Chalut, R. (2009). Optimal results: What libraries need to know about Google and search engine optimization. The Reference Librarian, 50(3), 234-247.

Reference:

Purdy, K. (2012, February 21). Give Google better feedback and bug reports and get better results. TechRepublic. Retrieved December 5, 2014, from http://www.techrepublic.com/blog/google-in-the-enterprise/give-google-better-feedback-and-bug-reports-and-get-better-results/ 

Novice Searching

It is easy to become overconfident as a novice searcher, because simply finding an answer to a research question feels like a success, even though that answer does not confirm that your search was exhaustive or surfaced the best results out there. When challenged with exercises based on more complicated or unfamiliar information needs, however, I found that my searches were sometimes misguided or incomplete. When I lost points on course assignments, it was usually for this reason: either I didn’t really understand how to find the answer, or I stopped searching prematurely.

In chapters 2 and 3 of her Librarian’s Guide to Online Searching (2012), Suzanne Bell provides a “Searcher’s Toolkit.” As I studied chapter 2 on the use of Boolean logic in searching, which she described as

“the most fundamental concept of all… In fact, this concept is so fundamental that you’ve probably run into it before, possibly several times through grade school, high school, and college. But do you really know what Boolean logic is and how it works? Do you really understand how it will affect your searches?” (p. 19),

I thought, “Finally, a practical use for that semester I spent in Logic class as an undergrad!” Of course, I proceeded to read the chapter thinking of how the application of Boolean logic to my own Searcher’s Toolkit was going to be a simple, yet valuable addition. Indeed, it was; when I implemented it in an assignment to compare databases, I quickly got a nice tight package of results. I found out later from my instructor that I actually didn’t quite understand the “Order of Boolean Operations” (Bell, 2012, p. 23) necessary to successfully use Boolean logic in a database and that the use of parentheses helps a lot, just like in mathematics.

I’m actually still not sure that I always do it correctly, but I think it will come with practice. As I continued working in my chosen database, I was quite bold with my use of Boolean logic in search terms and often opted for good recall over precision by searching broadly (Bell, 2012). It takes longer to sift through the false positives, but I was more satisfied not to miss relevant results, especially with a database like Ethnic NewsWatch that indexes a lot of newspapers, sometimes very superficially.
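The precedence pitfall Bell describes can be sketched in a few lines of Python. The article titles and the substring matcher below are hypothetical, purely to illustrate grouping; the key point is that in most databases, as in Python itself, AND binds more tightly than OR, so parentheses change which records come back:

```python
# Hypothetical mini-collection of article titles to "search" over.
articles = [
    "Allergies in cats",
    "Dog grooming basics",
    "Allergies in dogs",
    "Cat behavior",
]

def matches(title, term):
    """Case-insensitive substring match, standing in for keyword indexing."""
    return term.lower() in title.lower()

# Query "cat OR dog AND allerg": because AND binds tighter than OR,
# the database reads this as cat OR (dog AND allerg),
# so every cat title sneaks in whether or not it mentions allergies.
ungrouped = [t for t in articles
             if matches(t, "cat") or (matches(t, "dog") and matches(t, "allerg"))]

# Query "(cat OR dog) AND allerg": parentheses force the intended reading,
# returning only allergy articles about either animal.
grouped = [t for t in articles
           if (matches(t, "cat") or matches(t, "dog")) and matches(t, "allerg")]

print(ungrouped)  # includes the irrelevant "Cat behavior"
print(grouped)    # only the two allergy articles
```

The same tight result set I got in the database comparison assignment only appeared once the parentheses were in the right places.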

Throughout my semester group project (on Health Resources for Latinos), I also found it valuable to use some of Alastair G. Smith’s (2012) Internet search tactics that I wasn’t as familiar with before. The BIBBLE technique proved especially useful toward the end of the project when we realized we needed to find additional resources to complete sections of the LibGuide that were sparse (Smith, 2012). Webpages that had already compiled authoritative resources helped us fill in the gaps and saved us some time. In finding demographic information about the Latino population, we used the CROSSCHECK technique to be sure that we correctly represented Latino culture, especially since none of us are of Latino background (Smith, 2012). For example, before we summarized common Latino health behaviors, we consulted two or three scholarly articles on the topic for consistency, so as not to stereotype or overgeneralize.

We were unsure for quite a while about how to reconcile the focus and audience of our final project with the use of academic databases. Unfortunately, it took until we had to write our learning objectives for the final presentation before we had a clear idea of whom we were targeting and how we could arrange the LibGuide. While we were able to come up with scholarly resources all along, it seemed a little backwards, maybe even wasteful, to conduct a large-scale search on a general topic like “Healthcare for Latino Immigrants” and only later decide whether it was useful and how we would organize it.

Once we established the sections of the guide, we had to reevaluate where we had holes in our research and search again, which felt a little bit like starting over. Perhaps an outline of the guide earlier in the process might have been more efficient, but I’m not sure we would have had as global a view or encountered some of the resources that were the most valuable. For example, the plain language focus was a serendipitous find. I don’t think anyone had it in mind as a search term early on, since there weren’t resources from the academic databases specifically about using plain language in health care. It was a topic spawned from the section on improving care, which we had to scramble to develop because we missed the idea in the planning stages. We likely would have overlooked it completely had we been asked to come up with an outline of the organization of our LibGuide before we tackled the databases.

Likewise, when we finally focused on our audience as the health care providers serving Latino immigrants, the final searching and organization process accelerated immensely. However, if this had happened earlier in the process, again, we may not have stumbled across some of the resources we gathered with broader searches. Regardless, even though we narrowed down our resources to a reasonable selection, I couldn’t help but wonder if the task would ever really feel complete, especially given the depth of information out there about nearly everything.

Virtual Reference Interview Fail

I made a chat reference inquiry with a library worker from Florida’s Ask a Librarian statewide service. This was my question:

I’m looking for information on Richard III – was he healthy when he died, besides his humpback? It’s for a school project. Thanks!

I admit, I was a bit of a mole, since I already knew the answer I was looking for and part of the intent was to see if the reference staff followed current events/newspapers.

Overall, the experience was a bit of a letdown, especially compared with the very positive chat reference experience I had with an academic librarian a month prior, when I was searching for a reference cited in a book I was reading and needed digital interlibrary loan to get it.

The Florida librarian did not conduct a reference interview beyond the fields in the form I had to fill out to log on to the chat. (My name and ZIP code were required. My email was optional. I had to choose from a drop-down menu that I was a graduate student. This was also where I entered my question.) Once the librarian logged on, she proceeded to search for resources and send them to me. Her only questions were closed questions, such as confirming that I could open a link or that I had received her emailed article.

I was actually surprised by how rushed the reference chat felt. Several times during the interaction, the librarian tried to pawn me off on searching my local library’s catalog for books because she wasn’t finding suitable results with her own resources. Had this been a serious query, I would have left feeling like I was on my own and wondering why I had consulted a librarian in the first place, because she gave up on the search. I tried to give her feedback on the content of the articles, but she never did provide anything that actually answered the question. As she started to end the interaction, I was very tempted to give her a hint that maybe there would be forensic analysis somewhere, in light of the news of the discovery (since I had already found such an answer in a newspaper article). However, she ended the interaction quickly, without checking whether she had met my needs or waiting for my final “thank you.”

I suspect that, toward the end, she was in a hurry to finish up because the chat service was closing in 30 minutes, even though she hadn’t completely helped me. It is also possible that she figured out that my ZIP was not a Florida ZIP code, and, therefore, she had little obligation to help me because the service is for Florida residents. Since I also had to list on the form that I was a graduate student, she may have been less willing to try as hard for me because graduate students are usually more competent at searching on their own.

If I had been the librarian in this situation, I think I would have tried to learn more about the assignment and asked me what I had already found, instead of throwing resources at me, hoping I’d go away. I think that her attempts at the search were not successful because she didn’t actually conduct any sort of reference interview. It seemed like she was more interested in the mechanics of doing the search and completing the task than actually meeting my needs. I was also very put off that she didn’t confirm with me that I was satisfied with the interaction and did not even give me a chance to say thank you—to me, a librarian should always focus on providing this kind of customer service.