On this edition of Ecoffee with Experts, Matt Fraser interviewed Nick Musica, the Founder and CEO of Optics In. Nick offers top tips gleaned from his extensive experience that will help you dominate technical SEO. Get ready to take away some helpful advice and fascinating insights by watching this right now.
If you start with the user and that's your prime motivator, it's a little hard to go wrong using that as your starting point.
Thank you, Matt. A pleasure to be here.
Yeah, in my opinion, too. I started in 2003. And the thing about SEO is, I'm sure you and a lot of folks who are listening will agree, you're sort of doing it, and you don't even know you're doing it. If you're touching a website, you're doing SEO. It doesn't matter if it's intentional or unintentional. Around 2003, I was coding using good old-fashioned tables, writing copy, designing, and doing SEO. And then, at some point, it was time to figure out what exactly we were doing. So, I worked with Sherry Thoreau to bring some training into where I was working at the time, and she confirmed what we knew and what we didn't know. And that sort of set me on a trajectory of focusing and hunkering down on SEO as a career.
Yeah. I was working for AT&T at the time in New Jersey. And we hired an SEO firm. I think it was a six-month contract. Months one through three, the deliverable was four PDFs containing screenshots of the website in question and an arrow that said put keyword X here. And that was it, that was the extent of it.
And at the time, it was either $6,000 or $7,000 a month.
Yeah. So, whatever it was, four pages a month, it wasn't a lot. If you were to quote that work now, it would be two hours of work, three hours if you're distracted, maybe more if you didn't know the category.
A month, whatever it was, it's a very high rate. Yeah. So, then in the fourth month, and I remembered this explicitly since last we chatted, they sent over best practices for cross-linking, or interlinking, however you want to call it, on your website. And I copied what they had in the PDF, pasted it into Google, and what came back was basically the same thing: here are the best practices. It had nothing to do with our website; it was just generic advice: cross-linking is good, it helps Google find stuff, cross-link relevant pages, use internal anchor text, etc. There was nothing about our pages, nothing prescriptive. And so, at that point, my boss agreed to go get someone to train us. I found a gentleman that I liked, and he said, no, we don't want that person. We want someone who is established and has a reputation in the industry for training. And so Sherry came up as part of that search, and we brought her in. We did a two-day training, maybe three, and it was great. She just broke everything down, and we were able to nod at each other and go, oh, great, we're doing it right, or, we had no idea. She really helped solidify what was important, what wasn't important, what we had going well and what we didn't. And from there, as she predicted, she said, you know, this is sort of what happens when I do training, people get new jobs. Within four months, I had a new job. It was pretty cool.
That was it. So, instead of doing five different things in a mediocre way, it was time to hunker down on SEO, so I got my first SEO job at a car insurance company in New Jersey.
For car insurance, which is terribly competitive. And I'm sure this is the same thing in the broader car industry, at the time and even now. So, they wanted someone who knew SEO and could step in and start doing the work, which was cool. You know, car insurance is a category you must legally have in most states, and that makes working in that industry pretty easy from a ready, steady paycheck perspective. It was cool.
That’s a lot of money.
It’s a shame.
Yeah, sure. To respond to what you were talking about earlier, about SEO also having a bad name. I mean, it's well deserved; it has a checkered reputation because of exactly what you said, which makes the job harder for everyone who has been doing this for more than a hot minute and took a class with more than five slides that said, you know, you're now an expert, right? You know, it's hard, and that hasn't changed. What has changed is that there used to be a lot of big gorillas in the room that could make a website and get a ton of traffic, and they're still out there. But I think one of the more interesting things happening now, which also makes it harder to rank, is that there's a lot of intent matching going on that Google can sniff out better than it could a long time ago. So, and this is an oversimplification, simply writing good content that addresses the query is the best way to go. You know, ten years ago or so, you didn't have to do that. You could sort of bully the algorithm a little more and get the desired results. But the opportunities today are if you can find high volume, low keyword difficulty, and I'm referring to some of the tools that are out there. But more specifically, take a look at the search engine results pages and see who is ranking for what. Is there a hole you can fill by answering the query better? Because maybe, for argument's sake, Healthline is ranking really well for a long-tail query, but maybe it's not the best page possible. So, there can be room to elbow your way in there, with either one of the ten blue links, or maybe People Also Ask or another pattern on that page.
It was easier to rank back then. I think there were just fewer people doing it right and less understanding. And at some point, links became a bad idea because of Penguin and Panda. That really instilled fear in a lot of folks. And then sometime between then, around 2012, and now, content has become the cool kid. So, with all that new content, it's easier, from an algorithm perspective, to sniff out intent and pages that map to that intent, because you have that much content to work with. And then it's actually harder from an SEO perspective because there are all these patterns now. So, when position zero becomes a thing, and trying to rank for position zero, and someone goes, that's a good idea because it's not one of the ten blue links, that never used to be a conversation. It's easier and also harder depending on your perspective.
I mean, there are so many things that have changed. The wonderful thing about digital is anyone can do it, and the terrible thing about digital is anyone can do it. Like with Photoshop, anyone can be a designer, right? Like with Microsoft Word, anyone can be a writer, and with WordPress, anyone can be a publisher and an SEO. I mean, no, you're not. It doesn't work like that. For instance, I'm going to use a hammer very differently than a carpenter will use a hammer, and I will walk out with a bloody thumb. Right.
And they’re going to build a wonderful house for someone to live in. That’s not me. So, these systems are wonderful. We do a couple of different types of technical audits. We do a mini, and we have a full audit. So, the audit is typically reserved for folks with a legacy technology type. A cold fusion, ASP aspen, or a combination of those things. Typically, it’s going to be banged up pretty well at the time it was quote-unquote, working for what they were looking to accomplish, but now, for whatever reason, SEO is interesting for them, and they come to me now, and there’s a very specific lens on that website, be it SEO. That will get the full audit because there will be problems to uncover. The other CMSs, WordPress, Shopify, etc. We typically run a half audit on those. So, we take a look at the same things, but the number of issues that show up is typically less, so it just takes less time to go through it. But out of the box, they’re way better than legacy, but they’re still not optimized ad, so one example is Web Glow, which is the new cool kid. At least twice that I’ve seen websites with canonical tags banged up for whatever reason. So, that’s not going to help out. Among other findings, we’ll call the canonical tag rate the high-level priority. WordPress, the wonderful thing about WordPress is that it does everything in the world, and we do a lot of work with it. You want us to do something there’s a plug-in, right? If it’s maintained, if it’s saying it’s SEO friendly, please don’t think it’s SEO friendly just because it says it is. What is a wonderful thing about WordPress that’s a little snarky, what I just said leading into this next comment? When you take a look at the sitemap, sometimes when you put a new plugin in, you going to have instead of just pages and posts, authors, and whatever else may be in there by default, you going to have FAQs, testimonials, you’re going to have a bunch of other WordPress type plugins that show up. And sometimes, those plugins create XML statements for snippets of pages. So, now we have issues around the site maps creating essentially pages that are snippets of pages. And so now we don’t have actually ten pages of a website. Again, for argument’s sake, for a small website, what we have is ten pages and if they were to use ten different plugins to infuse back in the day was almost an include as is may include now we’re using plugins to do it. What we have now are other pages that are associated with those plugins. So, now we have ten real pages and ten plugins creating extraneous pages that are getting crawled and indexed and are competing with what we consider to be the pages or the posts.
Right. Absolutely.
It's almost like how Google will tell you there are 200 algorithmic considerations, but within each one, there are further considerations. So, we'll take a look at the meta descriptions, just using this as an example. Do they exist? Are they duplicated? Are they too short, or are they too long? We may say there are ten things we look at, but within each and every one, there are five facets of whatever we're looking at. And generally speaking, we use a checklist as one of our tools, and we have about 100 items in there that we just copy-paste and start the audit from.
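As an illustration of how one checklist item fans out into several facets, here is a rough sketch, assuming Python with the requests and beautifulsoup4 packages, placeholder URLs, and commonly cited (not official) length thresholds, that checks meta descriptions for existence, duplication, and length:

```python
# Minimal sketch of one checklist item -- meta descriptions -- run over a small
# list of URLs. Checks existence, duplicates, and rough length. The URL list and
# the 70/160 character thresholds are placeholders, not official limits.
import requests
from bs4 import BeautifulSoup
from collections import defaultdict

URLS = ["https://example.com/", "https://example.com/about/"]  # placeholder list

def meta_description(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("meta", attrs={"name": "description"})
    return (tag.get("content") or "").strip() if tag else None

seen = defaultdict(list)
for url in URLS:
    desc = meta_description(url)
    if not desc:
        print("MISSING:", url)
    elif len(desc) < 70:
        print(f"TOO SHORT ({len(desc)} chars):", url)
    elif len(desc) > 160:
        print(f"TOO LONG ({len(desc)} chars):", url)
    seen[desc].append(url)

for desc, urls in seen.items():
    if desc and len(urls) > 1:
        print("DUPLICATE description on:", urls)
```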
Yeah. And that may be good for many audits. But then you get into some of the bigger, more complex legacy stuff. It's helpful for discovery, but sometimes you have canonical tags that are doing something odd, or you have mixed-case URLs, or you have 301s that are looping because of the status code or because of the canonical tag. And so, the 100 items that we're checking for will not necessarily catch all that. It's a starting point for catching all that.
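For the looping 301s mentioned here, a small sketch along these lines can make every hop visible. It assumes Python with requests and a placeholder URL, and it follows Location headers manually instead of letting the library auto-redirect:

```python
# Sketch for spotting redirect chains and loops. Following Location headers by
# hand keeps every hop visible instead of collapsing the chain.
import requests

def redirect_chain(url, max_hops=10):
    chain = [url]
    while len(chain) <= max_hops:
        resp = requests.get(chain[-1], allow_redirects=False, timeout=10)
        if resp.status_code not in (301, 302, 307, 308):
            return chain, resp.status_code
        nxt = requests.compat.urljoin(chain[-1], resp.headers.get("Location", ""))
        if nxt in chain:
            return chain + [nxt], "LOOP"
        chain.append(nxt)
    return chain, "TOO MANY HOPS"

chain, final = redirect_chain("http://example.com/old-page")  # placeholder URL
print(" -> ".join(chain), "| final:", final)
```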
That was it. And honestly, I never intended to do technical SEO like this. I was always a content guy. My background was in journalism. So, I'm really a content nerd from that perspective. But every time I go look at someone's website, something technical always shows up, every time. So, like 404s or 301s. Even back in the day, not so much today, but back in the day, the www versus non-www thing, and secure versus not secure. It's like, well, you can have four different versions of your website.
That was a big thing.
Some people still don’t.
I don’t think it matters. It’s just picking one.
Yeah. It was definitely legacy, and I think it was Drupal.
I think it was Drupal.
And it sort of doesn't matter, because the problem, in this case, is not specific to the platform. It's specific to how the site was constructed at the time. And it was a version of, if it works, will SEO figure it out later? So, there were a lot of tags and categories being misused. There were a lot of smaller pages, like FAQ pages with 200 words, and you multiply that by a lot of pages, and then we're not getting the result we want. Well, there are solutions to all those problems. But, you know, when you multiply a bad practice by hundreds of instances, you tend to get a bad result. Actually, I would love to talk about when I used the canonical tag wrong. I think this is really helpful. So, I misused it at a job two jobs ago, my second-to-last job before starting the agency. Where I was working at the time, and this goes back to the domains, it was the DMV network, which sounds much like the DMV, but it's a private publisher. We just happened to publish a lot of DMV-related content. So, we had local.dmv.org, www.dmv.org, and search.dmv.org. And it was considered one website. And from my perspective, like yours, that's at least three websites. Let's understand what's on them. But the domain was considered one website, regardless of any subdomain. Are you following me?
So, it was coded from that perspective. The robots.txt was coded for one website: one domain, but three websites, so we shared one robots.txt across three websites. And me, coming in with my perspective, thinking about how I handle subdomains: each one gets its own sitemap, its own robots.txt, etc.
Right. Well, of course it does. So, I requested that the canonical tag be changed on one of the pages on www.dmv.org. Unbeknownst to me, it was changed on local dot in a way that didn't make sense. Local dot lost a ton of traffic about seven days after the canonical tag was changed. How did it happen exactly? I don't know. It certainly stemmed from my request, but I didn't know where to look at first, because that wasn't what I had requested. But I believe every page got canonicalized to the homepage.
Right. Because local dot was like local.dmv.org/topic/location, basically a topic by location: car insurance, a DMV location, or whatever. So, they all got canonicalized to the home page, which, according to Google, certainly at the time, is not a directive. It's a suggestion. Right? Well, no. In this case, they took it as a directive, which they shouldn't have. They should have recognized it as someone goofing up their website, ignored it, and continued business as usual. They should have. But they didn't. So, it took a little bit to find the issue. We fixed the canonical tag, and then I learned we were using the same robots.txt file on all the sites. And so, this started a big conversation about what we meant by "the site," a.k.a. the domain, versus the actual three subdomains. That technical architecture got detangled and made things better over the long run. Certainly, however, at that time, we lost a lot of traffic and a lot of revenue. I think it took four weeks for it to come back fully.
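A check like the following might have caught the problem early. It is only a sketch, assuming Python with requests and beautifulsoup4 and placeholder URLs standing in for the real subdomains, and it flags any page whose canonical tag points somewhere other than itself, especially the homepage:

```python
# Sketch: crawl a list of URLs and flag any page whose canonical points somewhere
# other than itself -- especially the homepage. URLs are placeholders.
import requests
from bs4 import BeautifulSoup

HOMEPAGE = "https://www.example.com/"
URLS = ["https://local.example.com/car-insurance/new-jersey/"]  # placeholder list

def canonical_of(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("link", rel="canonical")
    return tag.get("href").strip() if tag and tag.get("href") else None

for url in URLS:
    canon = canonical_of(url)
    if canon is None:
        print("No canonical tag:", url)
    elif canon.rstrip("/") == HOMEPAGE.rstrip("/"):
        print("DANGER -- canonicalized to the homepage:", url)
    elif canon.rstrip("/") != url.rstrip("/"):
        print("Canonical points elsewhere:", url, "->", canon)
```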
Right, not understanding the technical architecture, what I thought was a simple request on this website impacted the other one.
I kept my day job. It was a really great place to work for a very long time. So, no harm, no foul. And we understood why things happened the way they did. We corrected the problem, and it helped facilitate other improvements, as I said. But that was my first real live experiment in how what Google says and what Google does are very different things. You'll hear Matt Cutts, or Mueller today, saying this phrase in so many ways: we'll figure it out. It's okay, we'll figure it out. Well, that's not true, not from my perspective. You want to hedge your bets. You don't want to let them make the decision. You want to use the canonical tag correctly. You want to use the noindex tag correctly. You want to ensure your site is as tight as possible, or else an algorithm will decide for you.
Right. And in this case, it was. The canonical tag is only a suggestion. Well, unless, of course, they say it’s a directive. It was interpreted as a directive.
Yes, it is important. No, it doesn’t depend on the type of website.
Yeah. I mean, your best chance of having all the controls work for you, robots.txt, canonical tags, all the tools we have at our disposal, the best chance of having them work well is to come out of the gate strong and have the controls in place when you're launching a new website. Let's pretend it's a new website. So, with a new website, be super intentional about why you're using something and what result you're trying to get from it. So, we have experience there, and if it doesn't work out and we see pages in the index that maybe shouldn't be there, let's go back and take a look at our controls. Robots.txt is one of them. Google should be coming to the website, hitting robots.txt, getting its directives from there, and then going about its business.
And sometimes that's not the case. Like, a file you told Google to stay away from via robots.txt has now been crawled and indexed, and now it's out in the wild. Sometimes that's an issue, and sometimes it isn't; it depends on the scale of it. SEO is about how well you did at volume or how terribly you did at volume; it cuts both ways. So, you want to fight for the click. You want to ensure everything you're doing is above board, all your controls are in place, and you're using them the right way, or else you're causing confusion for Google, because robots.txt says one thing and the canonical tag maybe argues with it. For example, if you have a canonical and a noindex on the same page, that's a mixed signal. Google now needs to make a decision. Don't let it make a decision. So, being really intentional about having an indexation strategy for a website, whether it's an old site or a new site, what we want indexed, what we do not want indexed, and putting the right controls around it, is helpful. If you're not getting the reaction from Google with robots.txt, you can use the meta robots noindex on the page. But now we just got into crawl budget considerations: if it's a ten-page website, who cares; if it's a thousand-page website, then we care a lot.
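Here is a hedged sketch of that kind of mixed-signal check, assuming Python with requests, beautifulsoup4, the standard-library robots.txt parser, and placeholder URLs. It flags a page that is blocked in robots.txt but carries a noindex Google may never see, and a page that has both a noindex and a canonical pointing elsewhere:

```python
# Sketch of a "mixed signals" check: compare what robots.txt says, what the meta
# robots tag says, and where the canonical points, then flag conflicting combos.
import requests
from bs4 import BeautifulSoup
from urllib import robotparser
from urllib.parse import urljoin

SITE = "https://www.example.com/"          # placeholder site
URLS = [urljoin(SITE, "thank-you/")]       # placeholder URL list

rp = robotparser.RobotFileParser()
rp.set_url(urljoin(SITE, "robots.txt"))
rp.read()

for url in URLS:
    allowed = rp.can_fetch("*", url)
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    robots_meta = soup.find("meta", attrs={"name": "robots"})
    noindex = bool(robots_meta and "noindex" in (robots_meta.get("content") or "").lower())
    canon_tag = soup.find("link", rel="canonical")
    canonical = canon_tag.get("href") if canon_tag else None

    if not allowed and noindex:
        print("Mixed signal -- blocked by robots.txt, so the noindex may never be seen:", url)
    if noindex and canonical and canonical.rstrip("/") != url.rstrip("/"):
        print("Mixed signal -- noindex plus canonical to another URL:", url, "->", canonical)
```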
Yeah. That was the biggest website I worked on in my career to date. www.dmv.org had about 5,000 pages on it, let's just say.
We rewrote all the content on that site, so it was fresh, up to date, and as good as it could be. When you're researching government websites, some states are better than others, and it typically comes down to this: if the state has a good budget, their DMV-related sites can be pretty good. In those cases, we could research the content and distill the information down to be more actionable. But if it's a less funded state, then the content is harder to find, if it even exists, and it's harder to answer the questions, because sometimes the content is not as good in those less funded states. So, we did the best we could with what we had to work with copy-wise, and I believe we produced a pretty good product.
That’s right.
Yeah, I was there for six years. I started in 2012, and I left after six years. And when I started, the site got 85 million visits a year through SEO.
When I left, it was getting 185 million a year, largely through SEO.
Not a lot of it was developing new content. And the content wasn't good when I started; it all had to be rewritten. So, I don't want to say the content didn't add to that value; it certainly did. But when I did the work, the site was banged up technically. And with a site that banged up technically that was still doing that well, you just sort of had to remove the blockages so Google could crawl it way better and more efficiently. If it can crawl the website more efficiently and more frequently, it can get to the same page more frequently, the trust value of that page increases, and therefore the rank of that page starts to increase. You multiply that out by 1,000 to 5,000 pages, whatever it was, and now you go from 85 million to 185 million in six years. It was pretty great to see that.
Yeah.
Yeah. In this case, I would say cross-linking is your friend. Interlinking is your friend. Because if it's not four levels of depth that gets you, it's going to be pagination.
Right. So, it's either starting at the home page and then A to B to C to D, or it's going to be a few categories of content, but you click on one, and you're like, how many pages are in the pagination? So, you basically just have to help the little Googlebot out. Be Googlebot's friend, show it around your website, just lead it around, and you do that, to get back to the technical, with clean technical architecture. So, it's not chasing its own tail, and you're not giving it mixed signals; all that stuff is taken care of. Now we're just talking about access. Give it the right access in the right context. Four layers deep, maybe that's perfectly fine depending on the website, the site's quality, your internal linking structure, and whether you can navigate by clicking from department to category to subcategory to destination page without extra levels in between. But just make it easy: if a user can click through relatively easily and find that endpoint, Google should too, to a large degree, assuming clean architecture, a fast website, all those things.
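One way to see how deep pages actually sit is a small breadth-first crawl from the homepage that records click depth. This is only a sketch, assuming Python with requests and beautifulsoup4, a placeholder start URL, and a low page cap to stay polite:

```python
# Sketch: breadth-first crawl from the homepage, recording how many clicks it
# takes to reach each internal URL. Pages four or more levels deep (or buried in
# deep pagination) are the ones to look at.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse
from collections import deque

START = "https://www.example.com/"   # placeholder start URL
MAX_PAGES = 200                      # keep the sketch small and polite

def internal_links(url, html):
    soup = BeautifulSoup(html, "html.parser")
    host = urlparse(START).netloc
    for a in soup.find_all("a", href=True):
        link = urljoin(url, a["href"]).split("#")[0]
        if urlparse(link).netloc == host:
            yield link

depth = {START: 0}
queue = deque([START])
while queue and len(depth) < MAX_PAGES:
    url = queue.popleft()
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException:
        continue
    for link in internal_links(url, html):
        if link not in depth:
            depth[link] = depth[url] + 1
            queue.append(link)

for url, d in sorted(depth.items(), key=lambda kv: -kv[1])[:20]:
    if d >= 4:
        print(f"{d} clicks deep: {url}")
```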
I love Screaming Frog. We used DeepCrawl there because it was a bigger website, the flagship website, and we had to keep on top of it a bit more. We wanted to make sure we were getting reports regularly, and that if anything came out of a release a little wonky, we had visibility into it. That's why we used DeepCrawl there. But our go-to tools are frequently Google Search Console and Screaming Frog if we're doing technical work. But also, you can't just run reports. You need to poke around. You need to understand the website, or else you're not going to glean the issues out of those reports in and of themselves, or your understanding of the issues is going to be superficial rather than holistic.
We started the conversation off this way. But there have been agencies that just do a disservice, because $1,000 a month to a big agency does not carry the same weight that $1,000 a month would carry with a smaller company.
And so they're going to end up at the bottom of the pile, and you're going to have someone working on your website who may or may not know SEO, or they may leave your website, your account, to go on to the latest account that's worth twice as much, or whatever it may be. And I have one of these folks; they came from an agency that we don't need to name, but they came from an agency. And it was really difficult to come to grips with: you've been working with an agency for 12 months.
Well, they didn't do you any good. They just didn't. So, for 12 months, they got the service, and then we did the full audit on this website. It wasn't the mini; it was the full one, because nothing had been taken care of.
Totally unfortunate. So, now this business owner is at least 12 months behind on her results and is frustrated. I needed to dig out of a hole that I didn't even create, because this other agency created it, and ideally get the account and be able to work with them productively, because she was not happy.
For sure.
So, robots.txt and XML sitemaps. We want to ensure the sitemaps are included in Search Console and that there's not every sitemap under the sun in there; speaking of WordPress, that'll potentially produce a lot of them, depending on the plugins. Site speed, that's tricky, because you can have the same plugin on the same website and it will act differently on two different hosts. So, you must be sure your host, your plugins, and everything are lined up, or else you may get some wonky results there. But you're talking about page speed from the perspective of conversion and from the perspective of SEO. You know, having a site that lags when there are a couple of pages is probably not a big deal. Having a site that lags when you have a couple hundred, like if you plan on building out, if you have a publishing model, for example, you'd better pay more attention to it. It's going to be more meaningful in that context. So, work on it as best as possible. But then you run into this problem, too, right? WordPress is great because it allows you to do everything, but at some point, you're just weighing it down with all the plugins. And also, if you can get a hundred on that arbitrary score, good for you. Make the site as fast as possible. There's also this philosophy, at least one we have when we talk about page speed, which is: where do you rank next to your competition? How do you rank in terms of speed?
Because how my client compares in page speed to Apple, for example, that's just not relevant.
It’s all about contextualizing.
And also, it's one of 200 signals. So, you can't be in the doghouse, but how fast do you really need to be to get the results you're looking for?
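A very rough way to contextualize speed against competitors is to time a simple fetch of each homepage. This is only a first-pass proxy, server response time rather than rendering or Core Web Vitals, and the domains below are placeholders:

```python
# Crude sketch of "where do you rank in terms of speed next to your competition":
# time a plain fetch of each homepage a few times and compare medians. This
# measures server response, not full page rendering or Core Web Vitals.
import time
import requests

SITES = {  # placeholder domains
    "you": "https://www.example.com/",
    "competitor-a": "https://www.competitor-a.example/",
    "competitor-b": "https://www.competitor-b.example/",
}

def median_fetch_seconds(url, runs=3):
    times = []
    for _ in range(runs):
        start = time.perf_counter()
        requests.get(url, timeout=30)
        times.append(time.perf_counter() - start)
    return sorted(times)[len(times) // 2]

results = {name: median_fetch_seconds(url) for name, url in SITES.items()}
for name, secs in sorted(results.items(), key=lambda kv: kv[1]):
    print(f"{name}: {secs:.2f}s")
```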
Now that we have the Core Web Vitals, people get obsessive about a bunch of these things. I have a client with whom I do more consulting than anything else, and they made one of the best sites you can make technically. However, they got dinged in May, with an algorithm update, and it wasn't about the technology. It wasn't about the technical. It was about other things.
And everything was in place. Robots.txt, sitemap, you name the technical item, it was in place. It was really good: site speed, everything. It was all good. But for whatever reason, maybe Google didn't like the website, because it likes to just ding websites every once in a while, or because other things were going on. That's the case here.
In our audits, we include it in the content section.
We don't get into too much detail around it, but we'll call out that you have a bunch of thin content, with screenshots. This is more of a content conversation, but the whole point is to get the issue talked about.
Well, let's take that question without the noindex. It's most likely going to show up as a soft 404. If it's a 200 and there's no content on it, it will show up as a soft 404 in Google Search Console.
Which is bad. You don't want that. You want a hard 404 where you can get one. But this most likely will turn into a soft 404. If it's one page, that's one thing. But if it's about 1,000 pages, or whatever the number is that hits the threshold and makes it a bad time for your website, then you should probably nuke all of those pages. They should all be very hard 404s. Whether you go through robots.txt or you say noindex, again, Google does what it wants, for whatever reason, because maybe they found these pages before you put the tag on top, so it may or may not solve the issue. So, recently, and this is another website I worked on, a long-term client, a different client, they got dinged in May. And they got dinged because they were testing out, we'll call it, an affiliate shopping cart of sorts. They grabbed a feed and created a lot of pages. They were thin pages with noindex on them. But Google had already found some of them. Not all of them, not 6,000. There were like 6,000 product pages compared to 1,500 content pages. So, the balance was way out of whack. However, noindex was applied to these product pages. Did it matter? Google had found enough of them and didn't honor the noindex, because it had already found them, and when it came back, it had a chance to see the noindex, but it didn't matter. It tanked the website. Wow.
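A sketch like the following can help hunt for soft-404 candidates before Search Console reports them: URLs that return a 200 but carry almost no visible text. It assumes Python with requests and beautifulsoup4, placeholder URLs, and an arbitrary word threshold to tune per site:

```python
# Sketch: flag soft-404 candidates -- pages that return a 200 status but have
# almost no visible text. At scale these are better served as hard 404s/410s.
import requests
from bs4 import BeautifulSoup

URLS = ["https://www.example.com/empty-product/"]  # placeholder list
MIN_WORDS = 100                                    # arbitrary threshold to tune

for url in URLS:
    resp = requests.get(url, timeout=10)
    if resp.status_code != 200:
        continue  # a real 404/410 is fine -- that's what we want for dead pages
    soup = BeautifulSoup(resp.text, "html.parser")
    for tag in soup(["script", "style", "nav", "footer"]):
        tag.decompose()
    words = len(soup.get_text(separator=" ", strip=True).split())
    if words < MIN_WORDS:
        print(f"Soft-404 candidate ({words} words): {url}")
```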
And it was excluded. It wasn't supposed to be part of the Google game, right? Yeah, but it was. They were trying to leverage affiliate links from a feed for their audience, which were good candidates given the products that were available.
Right.
I don't know. But 80% of the traffic went missing. It was terrible. So, we took all those pages off, made the corrections, and it was a ready, steady six weeks later, after a false recovery where it went back down to almost nothing, and now it's back to where it was in mid-May. This was a core algorithm update. So, going back to another one of these what-Google-says versus what's-true perspectives: more than once, I'm sure you've read that when there's a core algorithm update, there's nothing you can do, just make your site better. It's terrible language. For us, it was good old-fashioned SEO. Mostly, Google had ignored the noindex. So, we removed the 6,000 pages, and the site came back, magically.
Yeah. Without having to go to Shop.domain.com. Something like that.
And after this experience, I would wonder, would that hurt the domain overall? I don’t know. I guess you are just transferring these bad pages to a subdomain. And so, we don’t like your entire domain now, maybe.
It sounds odd.
It always creates this page.
I mean, Flash is showing up in my brain. Back in the day, Flash was awesome. People loved Flash. It was so cool.
I wanted to learn it. I was awful at it. You could build an entire website out of Flash; the only copy on the page was the title tag and meta description. It was just terrible. But that still exists today, at least the spirit of it: doing the cool thing because someone thought it was cool.
Totally. It's such a cop-out answer, but it's true. If you start with the user and that's your prime motivator, it's a little hard to go wrong using that as your starting point. You can go wrong, but you'll go way less wrong. Like, is the navigation clear? Are the labels clear? And if you're like, this is cool, and you name the thing, the label, the page, whatever it is, something you think is funny, that's probably not going to be effective. But if you name it something that a user will identify with, that's way better. Simple.
Yeah, totally. Some of my favorites for what we’re talking about here, like if you’re a web designer who wants to learn more than just Photoshop, shall we say?
There's a firm I used to work with, The Understanding Group. Are you familiar with these guys?
The Understanding Group. So, they are out of Michigan. Peter Morville, who is the godfather of information architecture, was associated with them at some point. I worked with Peter when I was with a team in Oregon. Then he got too busy because he was writing a book, so we started working with Douglas. Coming back to the point: information architecture. I can't think of a better group than those guys to work with if you're looking for training. And this is just my perspective. Bruce Clay also has wonderful training. He's probably the sharpest technical mind out there, a super impressive guy.
Yeah. I forget the name of the book. It was written by Kelly Goto, and it was about the process of building websites. It was great, and the process, as far as I can recall, is still relevant today. But to your point, people say, just give me the website.
Definitely.
But we write a lot of long-form content, blog articles, whatever you want to call it. And folks are like, well, can you write a page, can you write the home page, can you write the about us page, can you write a blurb for this page? No. We don't have anyone scheduled for that, and we don't do that. We're not going to try to do that.
Let someone else who charges way less get it wrong, and then you can fix their copy.
Yeah.
Oh, God. Yeah.
So, let's break thin content down into two definitions. One is not enough words on the page; the other is you can have a thousand words on the page, but it's not keyword-focused.
The words are poorly written, or it's fluff, or something like that.
Well, we'll call it old-school SEO copy. That's sort of how people think about it. It's awful. So, if you have an article page that's maybe 200 words, I mean, what can you actually communicate to someone in 200 words that's...
Worth talking about? Nothing. So, if it's a category page, maybe. If it's your home page, it should have a little bit more than 200. But certain pages will have less copy because they're pass-through pages, and then you're going to have destination pages that have more copy. And that's just sort of how it works, right? The home page, by definition, is a very important page, but it's also largely a pathway page. It's designed to get you from the front page to something else. Category pages, same thing.
Paginated pages, same thing. So, less copy, but context: give users enough context to get to the destination. When you're talking about destination pages, and we do this, many folks will charge, when they're building copy, based on the words per page, and we sort of do that, but we back into it: you're going to buy a certain number of words from us, whatever the magic number is, and then we're going to build somewhere between 20 and 30 pages. And they're going to go, well, how many pages? Well, we'll use as many words as appropriate for each page. Some will be 800, some will be 1,200, and that's sort of our range. Now we're talking about the best copy for each page, not a flat thousand words, because that can be too much and turn into padded copy, which is another version of thin copy, or we didn't even get into the meat of the topic because we cut the page short. So, what are we even talking about?
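For the word-count side of thin content, a quick sketch like this one can bucket pages by rough type and flag outliers. It assumes Python with requests and beautifulsoup4, placeholder URLs, and treats the 800-to-1,200-word range mentioned here as a rough target for destination pages, not a hard rule:

```python
# Sketch: count visible words per page and flag thin pages relative to a rough
# page type. The URLs, page types, and thresholds are placeholders.
import requests
from bs4 import BeautifulSoup

PAGES = {  # placeholder URLs mapped to a rough page type
    "https://www.example.com/": "pathway",
    "https://www.example.com/guides/renew-your-license/": "destination",
}

def visible_word_count(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup(["script", "style", "nav", "header", "footer"]):
        tag.decompose()
    return len(soup.get_text(separator=" ", strip=True).split())

for url, page_type in PAGES.items():
    words = visible_word_count(url)
    if page_type == "destination" and words < 800:
        print(f"Possibly thin destination page ({words} words): {url}")
    elif page_type == "pathway" and words < 100:
        print(f"Pathway page with almost no context ({words} words): {url}")
```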
Yes. 100%. That’s part of the art and the science of it. So, the more you know about a topic, the more you can sort of imagine how to answer that question.
If you had a page, for argument's sake, on how long it takes to drive from New Jersey to California.
Well, it’s going to take 20 hours, something like that. If you don’t stop. Chances are you’re going to stop. So, where are you going to stop? Here are ten places going south that you should stop at. So, if it is 20 hours or whatever, you could go up to a thousand words because it’s relevant for that query, like, I’m going to have to stop.
Right. The best rest stops along your trip across the country.
So, the question is, should that be part of the "how long does it take" page, or should it link out from that page to its own page, along with the ten best restaurants not to miss and the ten best places to actually stay? Right?
Yeah. Jersey to California dot com.
You can search for me on LinkedIn: Nick Musica, "music" with an A. Or you can go to opticsin.com. It's not about glasses; it's about SEO. Those are two good places where you can find me.
Not so much.
Well, thank you very much. I’ll make sure we put those in the show notes. And I just want to thank you for coming on the show.
Thank you very much. Appreciate it. Thank you.