
Digital Elite Camp 2019 Recap: Takeaways from Every Speaker

There were 167 spots available for Digital Elite Camp this year. Filling them wasn’t hard given the location and the lineup.

The event offered three days of world-class speakers, misty Baltic beaches, and all the sauna-ing you could take (not to mention a near-endless supply of daylight).

There were also plenty of casual chats as well as cameos by attendees and speakers on the piano and in the DJ booth.  

Here’s what went down inside the presentation hall.

Alexa Hubley: “How to Compete in a Saturated Market”

  • Product-market fit:
    • Value hypothesis: why a customer is likely to use your product.
    • Product-market fit occurs when the value hypothesis is validated.
    • It occurs when your business achieves $100k+ in monthly revenue.
  • The role of product marketing is to define and operationalize the context.
    • What is it?
    • Who is it for?
    • How do you talk about it?
    • How do you bring it to market?
  • Use product lifecycle marketing to map campaigns to the stage of your product. The various stages are defined below:
    • Introduction stage: when the product first comes to market.
    • Growth stage: when it grabs market share (getting the consumers to prefer the brand).
    • Maturity stage: hold your dominant place in the market.
    • Decline stage: product sales start to shrink. Reinvent and milk the brand to stay relevant (e.g. Apple increasing iPhone prices).
  • Product lifecycle marketing does the following:
    • Ensures you’re operationalizing in the right context.
    • Spots early signs of a pending transition.
    • Shapes the curve of your business.
  • Follow the go-to-market strategy using the Google Sprint structure:
    • Research: ask the experts, competitive research.
    • Set targets: metrics to tackle the overarching goal for your product lifecycle.
    • Key Messaging: define the core messages and unique value proposition that you’ll communicate across every asset and channel.
    • Create Assets: align your assets and channels to the stage of the buyer’s journey.
    • Launch: set the campaign live.
    • Learn: identify tripwires and optimize the campaign continuously.
  • If your product came in a box:
    • What would be on the front of the box? (headline, subheadline, benefits)
    • What would be on the back of the box? (features that validate the headline, subheadline, benefits)

Russell McAthy: “Value of a Visit, Value of an Impression. Do You Know How Much Your Marketing Is Really Worth?”

  • The value of marketing changes across different phases:
    • Impressions: for display and social media.
    • Visits: how do we define a quality visit?
    • Conversions: what do we define as a quality conversion?
  • Quality impressions
    • Are we serving more ads to an individual or more ads to more individuals?
    • The goal of display marketing is to change the perception of the user.
    • Why do we have a 30-day lookback window? Because people wanted to get paid. That’s how long invoice windows are.
    • All the logic to understand the value of display is broken.
    • The actual performance of impressions is the impact on visits/conversions that have seen impressions vs. visits/conversions that have not.
    • Post-impression tracking is not good enough to understand display marketing. You need visibility into every impression and how much it influenced your actual conversions.
  • Quality visit
    • How do we monitor the quality of a visit? Conversion rate? No.
    • Conversion rate shows you how good you are at converting people today.
    • We need a metric or KPI that shows how good we are at increasing the likelihood that someone will convert in the future.
    • We often forget about an important group: people who came to the site, didn’t bounce, but still did things on the site.
    • The value of a visit is in understanding the individual impact that each specific visit has in influencing the future conversion.
    • Microconversions: viewing a product page, adding to cart, viewing a blog post.
    • For users who haven’t converted, we still have the delta in their propensity to convert, which gives us the value of a visit (see the sketch after this list).
    • Comparing conversion rate and the value of a visit can show a brand the difference between activity that is purely conversion driving vs. long-time value.
  • Quality conversions
    • Audience-based marketing (retargeting, segmentation).
    • Don’t use cost per acquisition—use the cost to acquire a customer and the retention cost to keep that customer coming back to buy again.
    • Spend more time optimizing for the second and third purchases.
  • Define quality accurately, then quantity will follow.
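
The propensity-delta idea lends itself to a simple calculation. Below is a minimal TypeScript sketch of valuing a visit as the change in conversion propensity it creates; the scoring function and its weights are invented for illustration and are not from the talk.

```typescript
// Hypothetical sketch: value of a visit as the change in conversion propensity.
// `scorePropensity` stands in for whatever model you actually use (e.g. a logistic
// regression over on-site behavior); its weights here are toy values.

interface VisitorState {
  pageViews: number;
  addedToCart: boolean;
  viewedBlogPost: boolean;
}

function scorePropensity(state: VisitorState): number {
  let p = 0.02; // baseline propensity to convert
  p += 0.01 * Math.min(state.pageViews, 10);
  if (state.addedToCart) p += 0.25;
  if (state.viewedBlogPost) p += 0.03;
  return Math.min(p, 1);
}

// The value of a visit is the propensity delta it creates, scaled by average order value.
function visitValue(before: VisitorState, after: VisitorState, avgOrderValue: number): number {
  return (scorePropensity(after) - scorePropensity(before)) * avgOrderValue;
}

// Example: a visit during which the user added an item to the cart.
const before: VisitorState = { pageViews: 2, addedToCart: false, viewedBlogPost: false };
const after: VisitorState = { pageViews: 5, addedToCart: true, viewedBlogPost: false };
console.log(visitValue(before, after, 80)); // the monetary value attributed to this visit
```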

Ben Labay: “How to Add Millions in Revenue with Voice of Customer”

  • Three strategic mental models for conducting and applying research:
    • “Knowing the name of something is not the same as knowing something.” We sell success. Speak to the customer’s pain, not the name of your product.
    • “Knowledge is equal to experience plus sensitivity.” Guidelines/best practices can sometimes distract from the sensitivity of experience.
    • “It is not enough to know.” Connect brand value to personal value. It’s not enough to understand, you’ve got to show that you understand.
  • Where do we show value, and when?
  • Four facets of an audience:
    1. User perceptions: UX benchmarking, like NPS, is a “surgical” way to understand loyalty throughout the customer journey. We can also identify credibility, appearance, and usability (SUPR-Q).
    2. Behavior (Friction): User testing is a great way to study behavior. It can be remote or in-house, moderated or unmoderated. Common user testing mistakes: wrong users, attempting to get data on user perceptions, leading questions/tasks, launching and reviewing all at once, and not accounting for the full range of customer intent.
    3. User motivation/goals: Segments tell us just who we’re talking to. Different segments have different motivations and different goals.
    4. Fears/Uncertainty/Doubts (FUDs): “Is anything holding you back from buying right now?”, “What almost prevented you from completing your purchase today?” Adding post-purchase polls to your thank-you page is a way to gain valuable feedback.
  • You generally need approximately 200–250 responses to accurately represent your population (a rough sample-size check follows this list).
  • Don’t survey just to survey; have goals.
  • Experience trumps everything (product, price); if you’re not improving your experience, you’re not improving your product.
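
As a rough sanity check on the 200–250 figure, here is the standard sample-size formula for estimating a proportion, sketched in TypeScript. The 95% confidence level and ~6.5% margin of error are assumptions chosen to show how a number in that range falls out; they are not from the talk.

```typescript
// Sample size for estimating a proportion: n = z^2 * p * (1 - p) / e^2
// Assumptions (not from the talk): 95% confidence (z = 1.96), worst-case p = 0.5.
function sampleSize(marginOfError: number, z = 1.96, p = 0.5): number {
  return Math.ceil((z * z * p * (1 - p)) / (marginOfError * marginOfError));
}

console.log(sampleSize(0.065)); // ≈ 228 responses, i.e. within the 200–250 range
console.log(sampleSize(0.05));  // ≈ 385 responses for a tighter ±5% margin
```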

Bethany Joy: “Standing Out Online with a Unique and Compelling Brand Voice”

  • Brand voice is a consistent way of writing that effectively communicates your organizational personality to your audience.
  • You can’t just learn from your competitors or try to be better than them.
  • Brand voice must represent who you are as an organization, your mission, how you operate, etc.
  • Don’t mimic how your audience communicates but understand their perspective.
  • Your voice has to be consistent.
  • Here are two contrasting examples:
    • The Ritz: Identifies as high class, luxurious, expensive.
      • “Step into a five-star world steeped in more than 110 years of history and the finest of British traditions when you arrive at our magnificent hotel in Piccadilly.”
    • Macmillan Cancer Support: Identifies as a positive, energetic organization helping people with cancer.
      • “We fund nurses”
      • “We climb mountains”
      • “We fight inequality”
      • “We give our time”
      • “We support families”
      • “We make coffee”
      • “We give grants”
      • “We live with cancer”
      • “We change lives”
  • These are two very different brand voices, each accurately communicating who they are as an organization to their audiences.
  • 5 steps to finding your brand voice:
    1. Look at your current brand voice (You already have one. Understand where you are at now.)
    2. Clarify your brand message.
    3. Define your desired voice.
    4. Flesh out your chosen voice.
    5. Develop simple guidelines.

Neville Medhora: “The ‘Modules Theory’ that Makes Content Better”

  • There are two main ways people think about a blog:
    • 1. Quickly fire out tons of blog content.
    • 2. Write amazing resources that get linked, shared, and ranked.
  • So what works?
    • Do research: take some time to research the “intent” of people reading.
    • Spend time making the best post in the world.
    • Post well-researched content maybe 1–3 times per month.
  • Rather than pumping out lots of posts all the time, creating thought-out and linkable content will eventually be the path to getting your content ranked.
  • You need to think of your content as a “product”—it can be flexible, it can change, you can forever update it.
  • So how can we make timeless content (or salvage a non-performing piece of content)?
    • Step 1. Create an angle.
    • Step 2. Sex up the headline.
    • Step 3. Make a grabbing image.
    • Step 4. Add helpful “modules” to the content, for example:
      • A “cheat sheet,” summary, or table of contents.
      • Your own unique scale or rating system.
      • A helpful map.
      • An embedded picture gallery.
      • A cost breakdown.
      • A calculator or custom generator.
      • A scraped and analyzed dataset.
      • Big lists or tables.
      • Shareable templates.
      • Anything that gets visitors to take an action.
  • From here, you can then break up your content into a ton of various social posts.
  • Every platform (Digg, Craigslist, Google+) will die or become saturated—your website is the one thing that stays constant.

Els Aerts: “The Lost Art of Asking Questions”

  • We need qualitative and quantitative data to paint the full picture.
  • How to build a good survey:
    • Use survey questions to weed out unqualified users.
    • Don’t ask predictive questions (e.g. “What will happen if…?”).
    • Don’t wait too long to ask about the experience they had.
    • Don’t ask leading questions (i.e. questions or framing that bias respondents).
    • More insights come from negative framing.
    • Use a neutral approach for best results.
    • Ask questions from a wide range of people.
    • Don’t send the email out to everyone at once. Send it out chunk by chunk to create a feedback loop and improve the questionnaire.
  • Survey positioning (when/where/how)
    • Give clear instructions on how to fill in the data.
    • If you want current, unbiased opinions, ask the question before the visitor engages with the site.
    • Exit surveys are good for gauging why people are leaving.
    • Be on brand when doing surveys.
  • Customer interviews
    • It’s important to talk to your customers.
    • Don’t call interviews “interviews”; invite people for a “chat,” and treat them like chats. Anticipate where the conversation could go and listen, listen, listen.
    • Record interviews.
  • Moderated testing
    • Echo questions: reflect the question back to the user.
    • Boomerang: ask the participant’s question back to them.
    • Columbo: play dumb, don’t answer the question; the participant will fill the blanks.
  • Always keep your mind on the initial task—what insights are you after? What are you trying to achieve?

André Vieira: “Improving Customer Journeys with Competitive Optimization”

  • As marketers, we compete for revenue and conversions.
    • You need to find your competitors, what they’re doing, and the competitive insight.
  • Competitive optimization is the continuous improvement of marketing initiatives through iterative exploitation of field advantages found via sequential, focused research with the goal of creating or expanding a competitive edge.
  • It targets a specific customer journey challenge vs. competitive analysis (used for general benchmarking).
    • Competitive optimization is not meant to be a stand-alone method.
    • Use insights for inspiration, not copying—we look at our strongest competitors and assume they know what they’re doing when, in reality, everyone is struggling.
  • How do we find our strongest competitors?
    • First ask yourself, then your colleagues, then turn to market data.
    • Ask your customers instead.
  • How to ask your customers about your competitors?
  • Use competitive user testing.
    • Direct comparison testing: use this if you’re just getting started; ask users to complete a task (just a slice of the journey, not the whole journey) on your website and competitors’; after they complete the task, ask questions.
    • Isolated comparison testing: use this if you already have some data; show users only one website; testers won’t get tired and you can interview more people, but you lose the ability to ask about site preference.
  • Ask the testers to think out loud, record the whole thing, do not show more than three websites at a time, and don’t talk too much yourself.
  • This process should be done once or twice a year.
  • These findings have to be turned into tests that make you money.
  • Look for two things:
    • What customers find important.
    • Glaring differences between you and competitors.
  • Don’t just implement findings—you need to A/B test first.
  • It’s competitive optimization, not analysis.
    • Insights > Ideas > Prioritization > Tests.
  • The customer wins if you get competitive optimization right.

André Morys: “Why ‘Agile Teams’ Fail (And You Might Be the Only Savior)”

  • Top three principles in A/B testing:
    • Give people a relevant reason to buy.
    • Combine herding + scarcity.
    • Make people feel good as soon as it gets tough.
  • Build your own system to find psychological principles that work.
  • Siemens spends $10 billion on customer experience—not well spent (they actually spent it on “agile” organization, and if agile teams produce shit, you get agile shit).
  • Beyond Agile: Analyze x Prioritize x Validate learning is the full package for growth.
    • Analyze: Understand customer problems. You feel how they feel. Understand the context. Get angry. Talk about customer feelings.
    • Prioritize: Throw away 999 other ideas. The best hypothesis will change customer behavior. Analyze the factors that influence behavior.
    • Validate: Do real quantitative validation. A/B test everything using statistical frameworks (a simple example follows this list).
  • Understand the reality of product owners and other stakeholders in an agile organization.
    • Agile teams usually focus on efficiency.
    • Customer centricity = increased workload.
    • A/B tests = increased workload, delays for sprints.
    • Quantitative methods add complexity.
  • Align with their processes and understand them.
  • Create the new reality for management:
    • Create the process, measure the real bottom-line outcome.
    • Go bigger—don’t just A/B test what others want you to test.
    • Data = Buy-in; your results automatically create the pull you need for change.
  • Don’t call it “CRO”—your mission is not A/B testing. You are driving growth by implementing experimentation. You are working at the core of “digital transformation.”
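
“Validate with statistical frameworks” can be as simple as a two-proportion z-test on conversion rates. The sketch below is a generic illustration in TypeScript, not the specific framework Morys uses, and the example numbers are made up.

```typescript
// Two-proportion z-test for an A/B test on conversion rate.
// |z| > 1.96 is roughly significant at the 95% level (two-sided).
function abTestZ(convA: number, visitorsA: number, convB: number, visitorsB: number): number {
  const pA = convA / visitorsA;
  const pB = convB / visitorsB;
  const pPooled = (convA + convB) / (visitorsA + visitorsB);
  const se = Math.sqrt(pPooled * (1 - pPooled) * (1 / visitorsA + 1 / visitorsB));
  return (pB - pA) / se;
}

// Made-up example: 500/10,000 conversions for control vs. 580/10,000 for the variant.
const z = abTestZ(500, 10_000, 580, 10_000);
console.log(z.toFixed(2), Math.abs(z) > 1.96 ? "significant" : "not significant");
```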

Ton Wesseling: “Validation in Every Organization”

  • Why our current CRO jobs will die:
    • “Jack of all trades” came in and started to run experiments on low-hanging fruit.
    • These people did not understand statistics but convinced management that they found “real” results.
  • Teams work differently.
    • Conversion teams run four- to seven-week sprints.
    • Product teams run two-week sprints.
    • Marketing teams can run campaigns even longer.
  • Optimizers are often (too) proud.
    • Can be self-righteous, overly critical, and fault-finding.
    • They can focus on others’ failures—telling other teams what they are doing wrong.
    • Optimization teams need to be more humble.
  • Why it actually might be a good idea to kill optimization teams:
  • The name needs to be client-centric.
  • It’s not a good idea to see ourselves as “fixers” all the time. Spending resources on simple button color changes is not what we’re for.
  • Optimization is the KPI you’re trying to impact. This sometimes includes clicks, behaviors, transactions, etc. You should be optimizing for potential customer lifetime value.
  • Pyramid of evidence, from top to bottom:
    • Randomized controlled trials.
    • Cohort studies, case-control studies.
    • Cross-sectional studies, surveys.
    • Case reports, case studies.
    • Mechanistic studies.
    • Editorials, expert opinion.
  • “CRO” jobs should be killed because it’s the wrong name and it focuses on the wrong KPIs.
  • How could we reach our goal? Optimization is all about effectiveness and efficiency.
  • Resources should be working on things that will make an impact and we need to be maximizing the number of positive impacts.
  • More stuff vs. better stuff—culture of validation creates more growth.
  • You have to become a Center of Excellence (CoE).
    • Overall evaluation criteria: enable evidence-based growth.
    • Going from CRO to CoE: the core optimization team may not be perfect for CoE.
  • Our big goal is to be effective—we want to have validation in our organization.

Erin Weigel: “Lessons Learned from Working in Brick and Mortar”

  • How do we find the right store?
    • Traditionally, offline people go down the street to look for a store, or to the mall.
    • What’s the digital equivalent? Google (for most).
  • In the offline world, what draws us into the store?
    • Signage—you can tell a lot about a store by looking at the signage (target market, customer experience, brands).
    • What’s the digital equivalent of this? Landing pages. Each element of the landing page represents the signage of the business.
  • We’ll take a look at a few brick & mortar retail stores below.
  • Ikea
    • Ikea has an amazing customer experience.
    • Very detailed floor design, careful about which categories to display next to each other; at the end of the floor, you go directly through the “checkout flow,” similar to what you would find in an online ecommerce store.
    • What’s the digital equivalent of this? Opening a new tab when clicking on a link. A tab is a good way to remind users to go back and finish the purchase.
  • Michael’s, a U.S.-based craft store
    • Michael’s sends catalogs with coupons to nudge people to go to the store and buy.
    • If you forget your coupon, you can still pick it up at the store and use it.
    • What’s the digital equivalent of this? Carter’s (another retailer) sends you an email with coupon codes, and at the top of the site, they show the exact same coupon you received in the email.
  • The visual cue of a shopping bag is another example of a nudge.
    • When you pick up your yellow shopping bag at Ikea, there’s a bunch of low-priced items (candles, napkins, etc.) at the beginning of the journey, so you feel good about starting to shop.
    • What’s the digital equivalent of this? The shopping cart icon. Think about it: what’s your “candle”? What’s the “special” in your business?
  • Haba, a U.S.-based toy company
    • At the retail store, the customer would receive a tote bag as a reward, but not a lot of people were picking them up. They redesigned the sign to make it stand out.
    • What’s the digital equivalent? On Booking.com, they highlighted the copy “Free” with green to make it more prominent, ran a test, and ultimately, increased the conversion rate.
  • In a physical store, people often want the product that is at the highest point and difficult to reach.
    • Curb cuts are an example of accessibility for people with wheelchairs.
    • What’s the digital equivalent? An accessible product that caters to all types of people. Accessibility is a great business; it improves the bottom line.

Tim Stewart: “Leap Forward Through Analytics”

  • How useful is classic event tracking for real analysis?
    • On a lot of sites, there are millions of events that are “(not set).” These companies have no idea what’s happening on their site.
  • Using Google Analytics together with Google Tag Manager makes everything easier.
    • Add events, scripts with custom triggers, dimensions, metrics, details on page views, hits.
    • Collect data as variables in Google Tag Manager and send the values to Google Analytics.
  • Here’s an auto industry example using dataLayer variables and stock variables:
    • Variables can feed variables.
    • We took a look at dataLayer variables and used stock values.
    • Which dealer had a form completion? What’s our most popular model year? What’s the price range? How are $30K products doing vs $40K products?
    • This information is useful for retargeting and for comparing product performance across different years.
  • Analysis like this lets us save £50–60K per month via better ad targeting.
  • Success Events
    • If [good thing] happens, track [good thing].
  • If you collect proper data in Google Analytics, you can use tools like Tableau, MS SQL Server, and R to do deeper analysis.
  • Event Goals
    • Goals are not just a “Sale Complete” URL.
    • Events are ideal for microconversions. For example, not just “thank you” but which “thank you”?
    • Using events to calculate form completion rates—how many people clicked on the form vs how many submitted it?
  • You can do as many events as you want in Google Data Studio. (Google Analytics has a limit, but Google Data Studio does not.)
  • If [bad thing] happens, track [bad thing]
    • Tracking out of stock items.
    • Tracking the amount of money lost to stock failure errors. For example, “stock failure” errors per user session.
    • Often, you don’t know something is out of stock until you try it yourself on a product detail page. That’s why it’s important to track all errors automatically with events (see the dataLayer sketch after this list).
  • After our product page redesign based on event tracking data, the results were up across the board.
    • AOV was down because higher volumes allowed us to lower prices.
    • Ended up taking over market share from the competition.
  • If your targets are revenue only and you’re not tracking microconversions or errors, you need to:
    • Think through the factors that may impact results.
    • Understand the causes of bottlenecks in key areas.
    • Think about your data and what you can investigate further.
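
As a concrete illustration of “if [bad thing] happens, track [bad thing],” here is a minimal sketch of pushing an out-of-stock error into the GTM dataLayer so a Custom Event trigger can forward it to Google Analytics. The event and variable names are assumptions for illustration; follow whatever naming convention your container already uses.

```typescript
// Minimal sketch: surface an out-of-stock error to Google Tag Manager via the dataLayer.
// In GTM you would create a Custom Event trigger on "stockFailure" and a GA event tag
// that reads the accompanying variables. All names here are illustrative assumptions.

declare global {
  interface Window {
    dataLayer: Record<string, unknown>[];
  }
}

function trackStockFailure(sku: string, productName: string, price: number): void {
  window.dataLayer = window.dataLayer || [];
  window.dataLayer.push({
    event: "stockFailure",            // custom event name the GTM trigger listens for
    stockFailureSku: sku,             // which product failed
    stockFailureProduct: productName,
    stockFailureValue: price,         // lets you sum the revenue at risk per session
  });
}

// Example: called when the product detail page discovers the item can't be added to cart.
trackStockFailure("SKU-12345", "Example Product", 39.95);

export {}; // keeps this file a module so the global augmentation above is valid
```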

Craig Sullivan and Charles Meaden: “Workshop: Craig and Charles’s Cornucopia of Google Analytics Tricks”

  • What are the things that had a lasting and transformative effect on the businesses?
    • Data, people, teams and processes.
    • Work back from the outcomes to the things that make the outcomes happen.
  • Here are 11 powerups and smartcuts:
  1. Inheritance tax
    • Take over an existing account.
    • Get all the history that you can.
    • For tags that have been changed, understand who made the updates, what changes are planned, and whether the changes affect time periods.
  2. Analytics auditing
    • “Out of the entire lifecycle of the analytics measurement, there is only one place guaranteed to pollute everywhere else and that’s the data collection layer.”
    • If you screw up the data collection layer, it will pollute everything else.
    • “Free” tools are not free of effort—they don’t run for free; they need love, money, attention, investment, and setup.
    • Auditing is the easy bit, but it’s what you do with it that counts.
    • Data collection is broken for most companies.
  3. Site walkthroughs
    • Walkthroughs are where we tear down the front-end experience and Google Analytics configuration to understand the flow of data.
    • Customer, collection, and configuration are most important.
      • What does the customer see in the browser, what tags fire when and why, and how does that data flow into Google Analytics?
    • User explorer: Add a fake parameter to the URL that you can filter later; see whether everything lines up.
  4. Early warning hacks
    • Use Google Analytics custom alerts.
    • Automate using the API or build something yourself.
    • Use off-the-shelf tools like Analytics Edge or Supermetrics.
    • Dashboards in Google Data Studio.
  5. Data skew
    • When your data in Google Analytics is correct but needs to be cleaned and untangled before you perform any analysis.
    • People lose trust if you don’t fix this.
    • A few examples: homepage stats fragmented by URL parameters; subdomain and cross-domain tracking not set up; payment gateways and third-party redirects (see the path-normalization sketch after this list).
  6. Data pollution
    • Extra data getting into Google Analytics, either from a third party or something you’re doing to yourself.
    • Common pollution sources include: internal traffic, bots, spam, monitoring, scraping, tag issues.
    • Debugging tip: look at session distribution report, filter to users with lots of sessions and high repeat sessions, get clues from network, hostname, bounce, browser, geo-IP, resolution.
  7. Data enrichment
    • You have to earn data quality.
    • Enrich at the interaction level, dwell time, short-term goals, long-term goals, and qualitative data.
    • Capture the important information that’s shown to a user, such as the category, subcategory, stock code, current and previous price, reviews, and stock status.
    • Capture every click to build a propensity to purchase model.
  8. Data models
    • Look at yield by landing page type.
    • Group landing pages by intent.
    • Pro tip: sketch it on paper first based on what you want to see in your head.
  9. Data automation
    • Automate when you need to run a report more than once that takes more than two minutes (or it involves multiple steps and clicks).
    • Get data quicker—you’ll have more time for thinking.
  10. Data ownership
    • Tracking is often added later, seen as a “bolt on.”
    • Integrate analytics and agile—a developed feature should not be considered done unless it’s trackable.
  11. Training and investment
    • Training people makes the job more rewarding and fulfilling: more analysis and less report-monkey work.
    • Encourage people to stay by investing in them.
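
For the data-skew example above (homepage stats fragmented by URL parameters), the remedy is usually a view filter or a normalization step before analysis. Here is a small, generic TypeScript sketch that strips tracking parameters so the variants roll up to one page; the parameter list and base URL are assumptions, so adapt them to your own site.

```typescript
// Normalize page paths so URL-parameter variants of the same page roll up together.
// The list of parameters to drop is an assumption; keep any parameters that genuinely
// change page content (e.g. search terms or pagination).
const DROP_PARAMS = new Set(["utm_source", "utm_medium", "utm_campaign", "gclid", "fbclid"]);

function normalizePath(rawUrl: string, base = "https://example.com"): string {
  const url = new URL(rawUrl, base);
  for (const param of [...url.searchParams.keys()]) {
    if (DROP_PARAMS.has(param)) url.searchParams.delete(param);
  }
  const query = url.searchParams.toString();
  return url.pathname + (query ? `?${query}` : "");
}

console.log(normalizePath("/?utm_source=newsletter&utm_campaign=june")); // "/"
console.log(normalizePath("/search?q=shoes&gclid=abc123"));              // "/search?q=shoes"
```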

Viola Eva: “Using Algorithmic Content Analysis for High-Impact SEO Upgrades”

  • “Relevance” for a search engine is about (technical) ranking factors, not necessarily human relevance.
  • We don’t know exactly how Google’s algorithm works, but the search engine shows its hand by the results it delivers—the cause (search query) and effect (result returned).
  • Purpose: target pages and keywords.
    • Best bet for optimization is for pages at the bottom of Page 1 or slightly behind; don’t bother optimizing pages that aren’t earning any impressions.
    • Other good opportunities: pages that already have links.
    • Your page should be optimized for a keyword cluster, not a single keyword; similarly, you want one target page per keyword cluster.
  • Research: successful competitors.
    • What do pages on Page 1 of the search results have in common? What’s the difference between my page and those that rank well on Page 1?
    • Apples to apples: If you’re trying to rank a product page, compare it to other product pages (not, for example, a “skyscrapered,” informational blog post).
    • Google processes/understands synonyms with similar intent—Google is focusing on intent more than individual keywords.
    • You’re probably overoptimized for the exact-match keyword and underoptimized for related keywords.
    • Chart various factors (e.g. keyword usage in the headline, internal links, image usage); if there’s a strong correlation between a ranking factor and ranking positions, it’s likely more influential (see the correlation sketch after this list).
    • Tools: pageoptimizer.pro, seotoollab.com/cora.php (most sophisticated but clunky), surferseo.com.
    • Once you have an idea of which ranking factors are most important, you run an experiment: rewrite/adjust content; give it three weeks to see a change in ranking.
  • Insights: definite proof vs. “good enough.”
    • All tools deliver correlational insights—SEO is about well-informed tweaks, not total knowledge.
    • Focus first on your greatest SEO deficiencies that may be holding back the things you’re doing right.
  • On-page changes can be procedural, predictable, easily delegated.
  • If this strategy seems simple, it’s because Google is still (to a degree) simple.
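
The “chart various factors” step boils down to correlating a factor with ranking position across the pages you have researched. Below is a minimal Pearson-correlation sketch in TypeScript; the data points are invented, and, as noted above, correlation only hints at influence rather than proving it.

```typescript
// Pearson correlation between a ranking factor (e.g. internal link count) and SERP position.
// A strong negative correlation (factor up, position number down) suggests the factor
// is worth testing; it is not proof of causation.
function pearson(xs: number[], ys: number[]): number {
  const n = xs.length;
  const meanX = xs.reduce((a, b) => a + b, 0) / n;
  const meanY = ys.reduce((a, b) => a + b, 0) / n;
  let cov = 0, varX = 0, varY = 0;
  for (let i = 0; i < n; i++) {
    const dx = xs[i] - meanX;
    const dy = ys[i] - meanY;
    cov += dx * dy;
    varX += dx * dx;
    varY += dy * dy;
  }
  return cov / Math.sqrt(varX * varY);
}

// Invented example: internal links pointing at each page vs. its ranking position.
const internalLinks = [42, 35, 30, 18, 12, 9, 5, 3];
const position = [1, 2, 3, 4, 5, 6, 7, 8];
console.log(pearson(internalLinks, position).toFixed(2)); // strongly negative here
```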

Polly Pospelova: “How to Get a 100% Lighthouse Performance Score”

  • Page load speed is highly relevant—every second of load delay impacts conversion.
    • This affects UX, quality score, cost per click, and ad rank.
    • Quality score affects your CPC. Increasing your quality score from 4 to 7 can save 50% of your Google Ads budget.
  • What is Lighthouse?
    • It’s an open-source, automated tool for improving the quality of web pages.
  • It has audits for:
    • Performance.
    • Accessibility.
    • Progressive Web Apps.
    • And it’s continuously updated.
  • You give Lighthouse a URL to audit, it runs a series of audits against the page, and then it generates a report on how well the page did.
  • The report offers opportunities for optimization and estimated savings.
  • What does it take to get a 100% Lighthouse performance score today (without changing how the page looks)? They ran a hackathon to find out. Here is what they learned:
    • 1. Always do HTTP/2.
    • 2. Defer offscreen images.
    • 3. Image quality—use WebP (vs JPG/PNG etc).
    • 4. Adaptive image sizes (<img srcset>).
    • 5. Lazy load images (see the IntersectionObserver sketch after this list).
    • 6. Inline critical resources—use cookies to determine whether to inline or serve from cache.
    • 7. Modern JS and rejection of heavy libraries.
  • Recommendations:
    • Unite with your developers (unless you can tackle Lighthouse yourself).
    • Ongoing optimization is the norm—today’s 100% will not be 100% tomorrow. Lighthouse optimization is not a one-off task.
    • After launching the 100%-score pages, they saw a noticeable improvement in SERP positioning.
    • “Volume of data sent” was one of the biggest sources of savings.
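
Items 2 and 5 above (defer offscreen images, lazy loading) are commonly implemented with an IntersectionObserver. Here is a generic sketch that assumes images ship with a data-src placeholder attribute; it is not the specific implementation from the hackathon.

```typescript
// Lazy-load images: swap data-src into src only when the image nears the viewport.
// Assumes markup like <img data-src="/hero.webp" alt="..."> with a lightweight placeholder.
function lazyLoadImages(): void {
  const images = document.querySelectorAll<HTMLImageElement>("img[data-src]");

  const observer = new IntersectionObserver(
    (entries, obs) => {
      for (const entry of entries) {
        if (!entry.isIntersecting) continue;
        const img = entry.target as HTMLImageElement;
        img.src = img.dataset.src!;      // start the real download
        img.removeAttribute("data-src");
        obs.unobserve(img);              // each image only needs loading once
      }
    },
    { rootMargin: "200px" }              // start loading slightly before the image is visible
  );

  images.forEach((img) => observer.observe(img));
}

document.addEventListener("DOMContentLoaded", lazyLoadImages);
```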

Jeff Bullas: “Influencer Marketing: Trend, Fad, or Fiction?”

  • It all begins with algorithms.
  • There’s a difference between the human and machine algorithms today.
    • You need to be a maverick.
    • You need to be a scientist and an artist.
  • What is influencer marketing? Influencing people to take action.
  • How did influencer marketing happen? It began with the rise of social media and the easy access we have today to connect with people from all over.
  • What makes influencer marketing powerful?
    • Global reach—removal of gatekeepers.
    • Addictive properties—social media built to keep users there and engaged.
    • Convenience.
    • P2P publishing.
  • Why is influencer marketing working?
    • 6 in 10 people prefer a YouTuber over a TV or movie star.
    • 53% of customers purchased due to an influencer.
  • Influencer marketing today is shifting from what it was like when it first began. There are three parts:
    • 1. Traffic. Traffic sources are changing. There’s a true challenge in hacking the algorithms, with constant updates and changes made by the platforms to improve user feeds. There’s an ongoing battle to find the best way to do “search.”
    • 2. Content. Learn to write converting headlines. Five times more people read the headlines than the content. For powerful conversation, use words that a 5th grader could understand.
    • 3. Conversion. Conversion happens when you evolve.
  • The biggest challenge for converting traffic and content—the human algorithm.

Conclusion

This is just a snapshot of the key takeaways from the stage. To get the full Digital Elite Camp experience—new connections, Baltic beach parties, and the (nearly) endless sun—you simply have to be there.

If you missed out, your next chance will be in the spring at CXL Live 2020 near Austin, Texas.

Hope to see everyone there, old and new.
