New Feature: Profile Scores

Thank you for the info! Please give whoever came up with LEx Luther a big high five for me.


That would be @aaron :raised_hands:


Looks good, great addition.


This looks like it will help! :smiley:


We’re liking the new rating system. Thanks for the continued improvements. Cheers!


I think this is great; we also like the storage detail regarding temps. Great time for some changes.


John, first off, I have been looking at the profile scores for a few months and think they are a great addition. Well done.

In the spirit of constant and never-ending improvement, here are some suggestions to consider.

Think about adding a time limit on the data pulled for these calculations, rather than using everything back to when someone joined the Exchange. Pulling all historical data on response time, ratings, etc. will skew the numbers toward a historical baseline, not telling the purchaser what the seller is currently doing.

Case in point: Randy, from the Mar 16 post, is penalized because he did not send a message back to a buyer who thanked him. How many years will this pull down his score? He would be better off starting over under a new name. In my opinion, this metric should start with the metric rating rollout and only look back 1 year. This would let the buyer know what the seller is currently doing.

Same with ratings. With an ag product, crops are different each year. Heat domes and wildfires made a difference last year. Many European hops are not irrigated, so they have even bigger fluctuations in quality. I have one bad rating from a buyer who did not tell me he got the wrong crop year. Zero communication until he sent in a rating, so there was no chance to fix the issue by replacing, refunding, etc. A great addition would be to require buyers to contact the seller before they can leave a low rating.


Glad you like the scores. It was far more work than I bargained for, but we’re also happy with the results. Thanks for the suggestions for ongoing improvements. I’ve made a few quick responses below to some of the various points you brought up.

What you suggest was considered during the design phase but we decided that using all-time data would give us better numbers in more scenarios and better reward members with long-standing track records. Take, for example, a brewery that needed to sell a bunch of hops in 2020, then zero in 2021, and is now selling again. Seeing their past performance is far better than having no score at all.

Randy will forever be penalized for that 1 missed response, but the penalty decreases with each of his subsequent timely responses. It looks like he’s already up to 80%.

The quality of each lot is rated via a separate process and is not part of the Reviews metric.

Sorry to hear about that. We already encourage folks to communicate with the other party to resolve issues first. This seems like an isolated event, but please let us know if you see a lot more of this so we can consider how best to address it. With a score of 99.6% and over 1800 positive reviews, I don’t think you need to worry about 1 bad review, but we can always remove a review if the person who wrote it asks us to. So, you could always message them directly and ask them to rescind their review.


I’m liking the new system; it seems to be working well. It’s somewhat odd that, for growers, “recent purchases” factors into our score.


We recently made some changes to Profile Score calculations. Specifically, we’ve changed how your Community Engagement score is calculated. The changes have been added (in bold) to the original post above. These changes only apply to orders placed after 4/1/23 and they do not affect merchant/grower accounts.


I’m just looking at this as I noticed our score is relatively low at 74.2%. I guess I was dinged on the same non-response issue, though there was nothing that needed a response. I also feel like this score/ranking algorithm is only going to elevate and favor the hop brokers who list here, since that is their main business, while us brewers might check email only a couple of times a day, may only have time to respond once a day, and ship a couple of times a week. So the brokers are going to end up at the top of the list while the brewers get pushed further down. Wasn’t the exchange originally set up so that brewers (not brokers) could sell/buy hops from other brewers?!


Hi Chris,

First off, a profile score of 74.2% isn’t that bad. You’re only 5.8% away from “LEXcellent.” But your average response time of 5 days is definitely hurting your score. Failing to respond to messages (order 143545, for example) hurts that metric quite a lot. There are plenty of brewers with response times better than some merchants, but there is a wide range of response times (and all other metrics) across both brewers and merchants. Here are some quick examples of brewers with great response times:

Buyers expect a high level of performance from any seller (brewer or merchant) and profile score metrics help folks understand exactly what they’re getting into when considering a listing. That’s why we devoted so many resources to profile scores ~18 months ago. It was a massive, long overdue project.

Your profile score will improve if you do the following:

  • Respond to messages quickly going forward
  • Read community posts - that includes old ones like this, which means your community engagement score (currently 0%) will have increased tomorrow!
  • Rate seller performance & lot numbers on future purchases
  • Keep doing what you’re doing for shipping speed (this is the most important metric for many buyers)
  • Maintain your 100% review & reliability metrics (nice work!)

Yes, The Exchange was originally closed to merchants, and I lost plenty of sleep over that decision. After a few months of operation and lots of feedback from the community, we learned that a completely free & open marketplace was what the majority of brewers wanted. That was definitely the right decision & what the industry needed. What started as the eBay of hops became the Amazon of hops, where brewers can find every variety, every brand, every vendor, real market prices, reviews, response time & shipping speed metrics, compare listings, and more. One Stop. All The Hops!

It’s definitely a buyer’s market and will be for years to come. This is a good reminder for everyone to pay attention to their profile scores now because you’ll want it to look good if you find yourself long on a variety and need to sell.


Our Memorial Day Giveaway survey uncovered that at least a few folks felt that our profile score calculation was biased. Additionally, I felt the need to both simplify the math and increase the weight of several metrics. Therefore, the following changes have been made:

  • The number of recent sales or purchases by any given user is no longer a factor when calculating scores.
  • Each of the 5 remaining metrics is now weighted equally.

Here’s a quick reminder of how each metric is calculated:

  1. Reviews = Positive Ratings Received / Total Ratings Received
  2. Reliability = 100% - (Orders I Canceled / (My Sales + My Purchases)) *Only applies to orders after 10/15/2021
  3. Shipping Speed = 100% - (1 / 336 x (Time of Carrier Pickup Scan - Time Action Required Email is Sent to Seller))
  4. Response Time (AKA Seller’s 1st Response Time) = 100% - (1 / 7200 x (Time of Seller’s 1st Response - Time of Buyer’s 1st Message for a Given Order or Listing))
  5. Community Engagement = ((seller ratings given/purchases) + (lot ratings given/listings purchased) + (Percentile rank of participation in The Lupulin Exchange Community, calculated as follows: :heart: Received x 100 + Days Visited x 1 + Time Read x 20))/3
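To make the equal weighting concrete, here is a minimal sketch of how the overall score could combine the five metrics above. The function and variable names are hypothetical; only the five metric names and the equal (20% each) weighting come from this post — the real calculation lives in LEx’s backend.

```python
def clamp(pct: float) -> float:
    """Keep a metric percentage within the 0-100% range."""
    return max(0.0, min(100.0, pct))

def profile_score(reviews: float, reliability: float, shipping_speed: float,
                  response_time: float, engagement: float) -> float:
    """Average of the 5 metrics, each weighted equally (20%)."""
    metrics = [reviews, reliability, shipping_speed, response_time, engagement]
    return sum(clamp(m) for m in metrics) / len(metrics)

# Example: strong seller with middling community engagement
score = profile_score(100.0, 100.0, 95.0, 90.0, 50.0)
print(f"{score:.1f}%")  # 87.0%
```

This also illustrates why community engagement moves scores so much under equal weighting: a 0% engagement score caps the overall score at 80%.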

Most members will find that their score only changed by 2% or less; however, users with very good (or very bad) community engagement will experience much larger shifts. And that is by design because The Exchange isn’t a store or a hop supplier; it’s a community. Members of the community can find every hop variety, every brand, and every vendor. They can effortlessly compare prices, reviews, shipping speeds, reliability, and more, and they can leverage the platform’s extensive resources & conveniences when they need to offload excess inventory or over-contracted positions. And we have many more exciting tools & features in development that will greatly benefit community members.

But with great power comes great responsibility. Members are expected to contribute to the community: rate & review experiences and lots, follow community discussions, provide feedback & ideas, speak up when something isn’t right, abide by our rules, and respect our core values. I’m confident that profile scores now do a better job of communicating the performance of each community member.

There is one area that I feel needs further improvement. It was pointed out by @tednjodie here when he said:

Pulling all historical data on response time, ratings, etc. will skew the numbers toward a historical baseline, not telling the purchaser what the seller is currently doing.

I agree with Ted’s concern, but it’s not an easy problem to solve. Recently, I watched a brewery that previously had a great track record become unresponsive & unreliable. It turns out they were going out of business. Due to their previous history, it took a lot of dropped balls to get their profile score into ugly territory, which means a lot of buyers had bad experiences that could’ve been avoided. The changes described above definitely help in this scenario but aren’t a perfect solution. Limiting history (Ted’s suggestion) isn’t perfect either, because it works well for merchants but not for small brewers who sell hops on LEx sporadically.

The current front-runner of a solution is to also display short-term (perhaps 30-day) history for each metric. Please LMK if you have a better idea!


I agree the best option is showing 2 histories, as noted. The shorter time frame should be at least 2 weeks, and I think 1 month is on the long side; consider 3 weeks for the duration. It needs to be long enough to give a realistic average of what is happening with a shipper and short enough to show the current trend.

One trend you need to avoid with this metric is weekends. I would suggest removing weekends from the rating, as Friday orders will skew the numbers in the short term in a way we would not see in the overall history.


I’m still not happy with our profile scores and plenty of brewers tell me they aren’t happy either. Here are the issues/problems that I want to solve:

  • Brewers only care about how they can expect a seller to perform today. They are not interested in a seller’s performance months or years ago.
  • Brewers don’t want to be penalized for failing to rate sellers and/or lots.

Here are the changes that I’m considering to further simplify profile scores and address the issues above:

  • Eliminate profile scores for buyers
  • Eliminate the Community Engagement metric
  • For the remaining 4 metrics, calculate averages based on the 5 most recent events only:
  1. Reviews = % of 5 most recent ratings I received that were positive
  2. Reliability = 100% - (% of my last 5 sales that I canceled)
  3. Shipping Speed = 100% if the carrier pickup scan is < 24hrs (excluding weekends) since the Hops Sold email is sent to seller. Subtract 11% for each additional day.
  4. Response Time = 100% - (1 / 7200 x (time of seller’s 1st response - time of buyer’s 1st message for a given order or listing))

I like using a number of most recent events because that incentivizes merchants & growers to always stay on top of things but it also works for the brewery or small farm that only sells occasionally. Using any given time period doesn’t work well for the latter.

How does everyone feel about this approach? Is it fair? Is it ideal? Is 5 the right number of events? Some sellers might have 5 orders in an hour, others might take a month or more to hit 5. I don’t want to make the number of events very high for several reasons:

  • Scores need to accurately reflect current performance
  • Most sellers will only get a review or message for a small % of their orders
  • Occasionally, we notice sellers who suddenly become unreliable and/or unresponsive, so it’s important for sudden drops in performance to show up quickly in these scores.

What do you think? Am I missing anything?

Here’s an overdue update: About a month ago we made the changes proposed in my previous post and it seems to be working quite well. Now, the seller’s most recent performance metrics are prominently displayed on each listing. Here’s an example of a seller with a perfect score:

And here’s an example of a seller from whom I wouldn’t expect a great experience:

Buyers can quickly scan the metrics that are most important to them on any given listing, and the color of each seller performance metric draws attention to any problem areas. As proposed, each metric scores only the most recent 5 events to indicate recent performance. Here’s how each metric is calculated:

  1. Reviews = % of 5 most recent ratings the seller received that were positive
  2. Reliability = 100% - (% of the last 5 sales that the seller canceled)
  3. Shipping Speed = 100% if the carrier pickup scan is < 24hrs (excluding weekends) since the Hops Sold email is sent to seller. Subtract 11% for each additional weekday.
  4. Response Time = 100% - (1 / 7200 x (time of seller’s 1st response to a buyer-initiated message thread - time of buyer’s 1st message)). This applies to both orders & listings. For additional nuances of Response Time, see here.
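For illustration, the Shipping Speed rule above (100% if the carrier scan comes within 24 hours of the Hops Sold email, minus 11% per additional weekday) might be sketched as follows. How exactly weekends are excluded is my assumption, not the precise LEx rule.

```python
from datetime import datetime, timedelta

def shipping_speed(email_sent: datetime, pickup_scan: datetime) -> float:
    """Start at 100% and subtract 11% for each weekday by which the
    carrier pickup scan overshoots the 24-hour window; Saturdays and
    Sundays do not count against the score (assumed weekend rule)."""
    late_weekdays = 0
    deadline = email_sent + timedelta(hours=24)
    while pickup_scan > deadline:
        deadline += timedelta(days=1)
        if deadline.weekday() < 5:  # Mon-Fri count against the score
            late_weekdays += 1
    return max(0.0, 100.0 - 11.0 * late_weekdays)

# Sold Monday 9am, picked up Wednesday 3pm: two extra weekdays late
print(shipping_speed(datetime(2023, 5, 1, 9), datetime(2023, 5, 3, 15)))  # 78.0
```

A Friday sale picked up the following Monday loses only one weekday under this sketch, which is the behavior the weekend concern raised earlier in the thread asks for.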

Also, scores are now company-wide, rather than per individual employee account. And don’t forget, you can always sort or filter listings by shipping speed or overall profile score. Buyers can expect a great experience from the vast majority of sellers on LEx, but profile scores now do a better job of communicating a seller’s recent track record and warning buyers of problems before they make a purchase. Thank you to the numerous brewers who provided feedback related to improving profile scores. These changes were heavily influenced by the results of our most recent survey and the numerous follow-up conversations that I had with respondents.

Hey Brian,

As mentioned in previous comments, the score calculations were revised to reflect current rather than long-term performance because that is what buyers care about.

i.e., Ted’s comment above:

Pulling all historical data on response time, ratings, etc. will skew the numbers toward a historical baseline, not telling the purchaser what the seller is currently doing.

Since only the most recent 5 events are calculated, your score will change quickly after a few new reviews.


Feel free to suggest a better formula that solves all of the issues mentioned in this thread.


I personally believe that historical feedback should also play a larger role.
