The Professional Impact Dictionary
The Law: If your bullet has no verb, you did nothing. If it has no metric, you achieved nothing. If it has no context, nobody cares.
This is not a list of words. This is the formula for translating work into value.
Every bullet on your resume must follow the formula. No exceptions.
The Formula (Non-Negotiable)
Verb → Context → Metric → Outcome
- Verb: What you actually did (Led, Built, Reduced, Scaled)
- Context: What made it hard (System complexity, timeline pressure, scale, constraint)
- Metric: The quantified result ($, %, time saved, volume)
- Outcome: Why it mattered (revenue, cost, speed, risk, retention)
The Anti-Pattern (What 90% Write)
❌ "Responsible for sales" → No verb. No action. Delete.
❌ "Worked on infrastructure" → No metric. No proof. Delete.
❌ "Improved team performance" → No context. By how much? 2%? 200%? Delete.
❌ "Managed projects" → Generic. Every PM writes this. Delete.
The Pattern (What Gets Hired)
✅ "Drove [Verb] enterprise revenue in EMEA [Context] by 30% YoY [Metric] by launching 3 new partnerships, adding $4.5M in net-new revenue [Outcome]"
✅ "Reduced [Verb] cloud infrastructure costs across 50-service architecture [Context] by 35% [Metric] by consolidating compute and eliminating zombie resources, saving $400K annually [Outcome]"
✅ "Built [Verb] data pipeline under 3-week deadline [Context] processing 2M records/day with 99.9% uptime [Metric] enabling real-time analytics for sales team [Outcome]"
The difference: The first set is job duties. The second set is business value. Only the second gets hired.
📖 Dictionary by Category
Leadership & Management
The Problem: Most resumes say "Managed team" or "Led department". This is meaningless without scope and outcome.
The Fix: Add team size, constraints, and measurable outcomes.
| Weak Verb | Strong Replacement | Impact Example |
|---|---|---|
| Managed | Directed / Orchestrated | Orchestrated 3-department collaboration (Eng, Sales, Ops) to launch product 2 months ahead of schedule. |
| Led | Spearheaded / Drove | Spearheaded company-wide migration to Agile, reducing sprint cycle time by 40%. |
| Helped | Enabled / Facilitated | Enabled 12-person team to achieve 95% on-time delivery through daily standups and blocker resolution. |
| Responsible for | Oversaw / Commanded | Oversaw $5M annual budget allocation across 4 product lines with 98% forecast accuracy. |
Real Bullets (Gold Standard):
- Directed cross-functional team of 15 engineers and designers to ship MVPs for 3 SaaS products in 6 months, driving $2M ARR.
- Orchestrated quarterly all-hands planning (100+ attendees), aligning OKRs across 5 departments and reducing conflicting priorities by 60%.
- Spearheaded talent acquisition strategy, hiring 20 mid-to-senior engineers in Q2/2025, reducing avg. time-to-hire from 90 to 45 days.
Stakeholder Coordination: Proving Communication Through Alignment
Most resumes claim "excellent communication" without proof. Stakeholder management is measurable through stakeholder count + meeting cadence + alignment outcomes. The formula: who you coordinated, how often, what changed. This proves coordination ability, not just activity. For complete stakeholder management proof with role-specific examples, see our Stakeholder Management Metrics guide.
Stakeholder Management Verbs:
- Coordinated / Aligned / Facilitated / Resolved
Proof Pattern: [Action] + [Stakeholder count + departments] + [Cadence] + [Outcome: conflict reduced, decision velocity, time saved]
Examples:
- Coordinated 15 cross-functional stakeholders (Engineering, Product, Sales, Legal) through weekly alignment meetings, reducing conflicting priorities by 40% and accelerating feature launch by 3 weeks.
- Facilitated bi-weekly reviews with 8 department heads, cutting roadmap approval time from 6 weeks to 2 weeks and eliminating 12 cross-team blockers.
- Resolved roadmap conflicts between Engineering and Sales through structured sync process, reducing feature scope changes by 60%.
IC Leadership: Influence Without Authority
Individual contributors without direct reports can prove leadership through initiative ownership, RFC adoption, mentorship outcomes, and cross-team influence. The formula: what you started + who adopted it + what improved. This is leadership through technical decision influence, not title-based authority. For comprehensive IC leadership metrics including RFC adoption and de-risking impact, see our Leadership Without Title guide.
IC Leadership Verbs:
- Authored / Proposed / Initiated / Mentored / Identified
Proof Pattern: [Initiative] + [Adoption count: teams/people] + [Outcome: time saved, risk prevented, quality improved]
Examples:
- Authored RFC for microservices migration strategy adopted by 8 engineering teams, reducing deployment time from 2 hours to 15 minutes.
- Mentored 3 junior engineers through structured 1-on-1s and code reviews, all promoted to mid-level within 12 months.
- Built CLI tool for database migrations adopted by 10 engineers, cutting manual setup time from 2 hours to 5 minutes.
- Identified security vulnerability in auth flow, proposed fix adopted across 10 services, preventing potential breach affecting 2M users.
Engineering & Tech
The Problem: Engineering resumes drift into tool lists. Recruiters do not hire “React”. They hire “shipped X that moved Y metric”.
The Fix: Anchor every technical bullet to an outcome: latency, cost, reliability, conversion, developer velocity, security risk, incident rate.
Strong Verbs (Engineering):
| Weak Verb | Strong Replacement | Impact Example |
|---|---|---|
| Worked on | Built / Implemented | Implemented event-driven pipeline for 12 services, reducing job latency by 45%. |
| Fixed | Resolved / Eliminated | Eliminated P1 outage class by adding circuit breakers, cutting incidents by 60%. |
| Improved | Optimized / Accelerated | Optimized Postgres indexes, cutting p95 query time from 900ms to 120ms. |
| Helped | Enabled / Unblocked | Enabled 25 engineers to ship weekly by automating CI checks, saving 30 hrs/week. |
| Maintained | Hardened / Stabilized | Hardened auth flow by driving MFA rollout to 92% adoption, reducing account-takeover risk. |
Real Bullets (Gold Standard):
- Built feature flag system for 8 microservices, enabling safe rollouts and reducing rollback time from 30 minutes to 3 minutes.
- Implemented caching + pagination on search endpoints, increasing throughput by 2.7× and reducing infra cost by 18%.
- Stabilized CI by eliminating flaky tests (top 20), reducing pipeline failures by 55% and improving deploy frequency.
Engineering Impact: Four Performance Dimensions
Engineering value extends beyond "shipped features." Technical impact is measured across performance (latency/throughput), reliability (uptime/error rates), scalability (load capacity), and technical debt reduction (maintainability improvements). Each dimension proves a different type of business value: speed improvements drive conversion, reliability prevents revenue loss, scalability enables growth, and debt reduction accelerates future development velocity. For comprehensive guidance on quantifying each dimension with calculation methodologies and role-specific examples, see our Engineering Metrics guide.
Performance & Reliability Examples:
- Reduced API response time from 850ms to 120ms (86% improvement), improving checkout conversion by 4%.
- Improved system uptime from 99.2% to 99.8% by implementing retry logic and circuit breakers (60% fewer outages).
- Increased throughput from 500 req/s to 1,200 req/s by implementing connection pooling and caching layer.
- Decreased MTTR from 45 minutes to 12 minutes by building automated rollback and health-check systems.
Scalability & Debt Reduction Examples:
- Redesigned data pipeline to handle 10x traffic growth (from 5k to 50k requests/minute) without infrastructure cost increase.
- Refactored legacy authentication system, cutting average bug-fix time from 8 hours to 3 hours (62% reduction).
- Increased test coverage from 45% to 85%, reducing production incidents by 40%.
Sales & Revenue
The Problem: Sales bullets often sound like “did activity” (calls, meetings). Activity is not value. Value is pipeline, win rate, ARR, renewal, retention.
The Fix: Tie activity to money outcomes and deal context: segment, cycle length, ACV, objections, stakeholders.
Strong Verbs (Sales):
| Weak Verb | Strong Replacement | Impact Example |
|---|---|---|
| Called | Prospected / Qualified | Qualified 120 inbound leads/month, improving SQL rate from 18% to 31%. |
| Sold | Closed / Converted | Closed $480K ARR across 6 enterprise accounts in Q4, exceeding quota by 22%. |
| Negotiated | Secured / Won | Secured 2-year renewal at 15% uplift, preventing churn of $250K ARR. |
| Presented | Pitched / Influenced | Influenced 5-stakeholder buying group, shortening cycle from 90 to 62 days. |
Real Bullets (Gold Standard):
- Closed $1.2M ARR across mid-market accounts by rebuilding discovery workflow and improving close rate from 14% to 21%.
- Secured renewal of top 3 accounts (total $900K ARR) by redesigning adoption plan, increasing product usage by 35%.
Sales Efficiency: Beyond Quota Attainment
Quota attainment is the baseline—the strategic signal comes from how you achieved it. Sales excellence is measured across three efficiency dimensions: pipeline velocity (how fast deals move through stages), win rate (conversion percentage vs. company benchmark), and deal size progression (average contract value growth over time). The formula: quota percentage + ranking + efficiency metric (win rate, cycle time, or deal size growth). This proves systematic execution, not just hustle or luck. For comprehensive sales metrics including pipeline coverage, CAC payback, and multi-stakeholder engagement patterns, see our Sales Metrics guide.
Operations & Efficiency
The Problem: Ops resumes bury the lede. Operations is about throughput, waste, cycle time, quality, and reliability.
The Fix: Use verbs that imply systems thinking and measurable throughput improvements.
Strong Verbs (Ops):
| Weak Verb | Strong Replacement | Impact Example |
|---|---|---|
| Handled | Streamlined / Standardized | Standardized SOPs across 4 sites, reducing rework by 28% and improving QA pass rate. |
| Managed | Coordinated / Orchestrated | Orchestrated vendor transition under 2-week deadline, avoiding stockout risk. |
| Improved | Reduced / Eliminated | Reduced order-to-ship time from 4.2 days to 2.6 days by redesigning pick/pack flow. |
| Tracked | Instrumented / Monitored | Instrumented KPI dashboard, enabling weekly root-cause reviews and 12% cost reduction. |
Real Bullets (Gold Standard):
- Reduced monthly operating cost by $65K by renegotiating vendor contracts and consolidating tooling across 3 teams.
- Standardized onboarding workflow, cutting time-to-productivity from 14 days to 5 days for 40 new hires.
Operations Excellence: SLA Compliance & Process Efficiency
The strongest operations resumes prove reliability at scale. Operational value is measured across three dimensions: SLA compliance (service reliability and target achievement), cycle time & throughput (speed and capacity), and defect reduction (quality under pressure). The formula: maintain quality + improve speed + increase capacity = operational leverage. Operations metrics prove execution reliability—they show you can balance speed and quality without breaking things. For comprehensive methodologies on SLA metrics, error rate reduction, cost per unit optimization, and capacity planning with role-specific examples, see our Operations Metrics guide.
SLA & Quality Examples:
- Maintained 99.8% SLA compliance across 12,000 monthly service requests while reducing resolution time by 35%.
- Reduced order error rate from 2.3% to 0.4% by implementing automated validation checks.
- Improved platform uptime from 98.5% to 99.7% by implementing proactive monitoring and automated failover.
Throughput & Efficiency Examples:
- Increased daily processing capacity from 500 to 1,200 orders without additional headcount through workflow automation.
- Reduced cost per transaction from $4.20 to $2.80 through process redesign and automation.
- Cut average ticket resolution time from 48 hours to 12 hours while maintaining 99.5% SLA compliance.
Partnerships & Business Development
The Problem: BizDev resumes confuse deal-signing with deal-making. "Established 12 strategic partnerships" proves you can negotiate contracts, not that you built revenue engines.
The Fix: Partnerships are measured by revenue leverage: partner-sourced pipeline, co-sell wins, integration adoption, channel ROI, and activation rate.
Strong Verbs (Partnerships/BizDev):
| Weak Verb | Strong Replacement | Impact Example |
|---|---|---|
| Partnered | Activated / Launched | Launched AWS co-sell partnership generating $6.3M in partner-sourced pipeline. |
| Established | Built / Scaled | Built channel partner network driving $3.9M in partner-sourced revenue within 9 months. |
| Managed | Enabled / Accelerated | Enabled 14 of 16 partners to generate pipeline (88% activation vs. 35% industry avg). |
| Negotiated | Secured / Closed | Secured Salesforce integration adopted by 41% of enterprise customers. |
Real Bullets (Gold Standard):
- Built AWS co-sell partnership generating $6.3M in partner-sourced pipeline (31% of enterprise pipeline) with 54% win rate, 22 points higher than direct sales.
- Reduced partner time-to-first-deal from 13 months to 5.2 months (60% improvement) through structured enablement program, accelerating channel ROI by $2.1M in Year 1.
Partnerships as Leverage: Proving Multiplier Effect
The strongest partnerships resumes prove revenue multipliers, not relationship counts. Partnership value is measured across three layers: revenue attribution (partner-sourced vs. partner-influenced pipeline and closed deals), channel efficiency (partner activation rate, time-to-first-deal, co-sell win rate), and ecosystem scale (integration adoption, marketplace performance, developer engagement). The formula: signed partnerships + activation rate + revenue contribution + efficiency metric (cycle time, win rate, or ROI). This proves systematic channel building, not just handshake collection. For comprehensive partnership metrics including integration adoption patterns and channel ROI calculation methodologies, see our Partnerships & BizDev Metrics guide.
🛠️ The "Weak vs Strong" Translator
| Weak Word | Strong Replacement | Context Example |
|---|---|---|
| Helped | Facilitated | Facilitated cross-departmental workshops... |
| Worked on | Executed | Executed migration of 5TB database... |
| Managed | Orchestrated | Orchestrated launch of 3 products... |
How to Use This Dictionary (Fast)
If you want this to actually improve your resume (not just look smart), use it like a filter:
1. Pick one bullet from your resume that feels “fine” but not impressive.
2. Identify the verb you used. If it is weak (“Helped”, “Worked on”, “Responsible for”), replace it.
3. Add context: scope, constraints, stakeholders, system size, budget, region, timeline.
4. Add a metric: dollars, %, hours, users, volume, cycle time, error rate, quality score, tickets.
5. Add business impact: revenue, cost, risk, speed, retention, reliability, customer outcomes.
If you can do steps 2–5, you are writing “Gold”. If you cannot, the bullet is fluff and should be removed or rewritten.
Building Your Metrics Inventory (Data Mining)
The #1 objection to quantified bullets? "I don't have metrics." But the data exists—in your calendar, tickets, dashboards, retros, and feedback. The metrics aren't missing; you haven't extracted them yet.
The 8 hidden data sources for every role:
- Calendar → stakeholder count, meeting frequency, cross-functional breadth
- Ticket systems → throughput, resolution time, backlog reduction
- Dashboards → user growth, conversion rates, performance trends
- Git/code → commits, PRs merged, code reviews completed
- Performance reviews → manager-cited achievements, goal attainment
- Project docs → budget size, team size, timeline delivery
- Retrospectives → efficiency gains, velocity increases, time saved
- Email/Slack praise → client outcomes, peer recognition with specific numbers
Build a quarterly metrics inventory from these sources—you'll never scramble for proof again. For complete step-by-step data mining methodology across all 8 sources, see our metrics inventory system.
Metric Bank (Steal These Categories)
If you “don’t have metrics”, you usually have the wrong mental model. You might not have revenue, but you almost always have scope and constraints.
High-Impact Bullet Templates (Fill the Blanks)
If you want a bullet that reads like a recruiter magnet, use templates. Do not freestyle.
Template 1: Speed
Accelerated [process/system] for [scope], cutting [metric] from [baseline] to [result], enabling [business outcome].
Template 2: Cost
Reduced [cost line item] by [percent/$] by [lever], saving [amount] annually while maintaining [quality/reliability].
Template 3: Quality
Improved [quality metric] from [baseline] to [result] by [action], reducing [risk/defects/rework] by [metric].
Template 4: Scale
Scaled [system/program] to [new volume/users/regions] without adding headcount by [automation/standardization], improving [metric] by [result].
Template 5: Risk
Mitigated [risk] across [scope] by [control/process], reducing [incidents/findings/churn] by [metric].
Common Resume Verb Traps (Avoid These)
Some verbs are not “wrong”, but they are expensive because they force the recruiter to guess what you did. Replace them unless you immediately follow with proof.
Customer Success & Support (Proof Over Politeness)
The Problem: Support resumes list “handled tickets” and “helped customers”. That reads like basic competence.
The Fix: Show outcomes: CSAT, first response time, resolution time, deflection, churn risk reduced, escalations prevented.
Strong Verbs (CS/Support):
| Weak Verb | Strong Replacement | Impact Example |
|---|---|---|
| Helped | Resolved / Unblocked | Resolved 180 tickets/month at 96% CSAT, reducing escalations by 35%. |
| Answered | Diagnosed / Triaged | Triaged P1 issues and drove handoffs, cutting time-to-resolution by 28%. |
| Explained | Educated / Enabled | Enabled customers to adopt feature, increasing activation by 14%. |
| Handled | De-escalated / Retained | Retained at-risk accounts by rebuilding onboarding, protecting $220K ARR. |
Real Bullets (Gold Standard):
- Reduced average first response time from 6h to 45m by redesigning triage rules and staffing schedule.
- Built 25-article help center and macros, deflecting 18% of inbound tickets and freeing 12 hrs/week.
Customer Success: Retention, Expansion & Net Revenue
The strongest Customer Success resumes don't just show "customer satisfaction"—they prove revenue retention and account growth. CS value is measured across three core dimensions: retention (churn reduction, renewal rate), expansion (upsell rate, expansion MRR, cross-sell adoption), and Net Retention Rate (the ultimate CS metric showing existing customer revenue growth). The formula: portfolio size + retention metric + expansion outcome = CS revenue impact. NRR above 100% proves you're not just preventing churn—you're growing accounts without new customer acquisition. For comprehensive methodologies on calculating NRR, reducing churn, measuring health score improvement, and segment-specific CS metrics (SMB vs. Enterprise vs. PLG), see our Customer Success Metrics guide.
Retention & NRR Examples:
- Maintained 121% NRR across $8M portfolio by identifying expansion opportunities in 40% of accounts.
- Reduced monthly gross churn from 6.2% to 2.8% by launching proactive health monitoring across 200+ accounts.
- Achieved 118% NRR across enterprise segment ($250K+ ARR), driven by 40% upsell rate and 12% cross-sell adoption.
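The NRR arithmetic behind bullets like these is worth being able to reproduce on a whiteboard. A minimal sketch of the standard calculation; the portfolio dollar figures below are hypothetical, not taken from the bullets above:

```python
def net_retention_rate(starting_arr, expansion, contraction, churned):
    """Net Retention Rate over a period, as a percentage of starting ARR.

    NRR = (starting ARR + expansion - contraction - churned) / starting ARR
    A result above 100% means existing accounts grew faster than they shrank.
    """
    ending_arr = starting_arr + expansion - contraction - churned
    return ending_arr / starting_arr * 100

# Illustrative book of business (hypothetical): $8.0M starting ARR,
# $2.1M upsell/cross-sell, $0.3M downgrades, $0.6M churned.
nrr = net_retention_rate(8_000_000, 2_100_000, 300_000, 600_000)
print(f"NRR: {nrr:.0f}%")  # prints "NRR: 115%"
```

If you can show the four inputs, the claim survives due diligence in any CS interview.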
Expansion & Account Growth Examples:
- Generated $360K in annual expansion MRR by identifying upsell triggers and delivering targeted QBRs to 50 enterprise accounts.
- Improved avg. customer health score from 62 to 78 (scale 0-100) by launching automated engagement campaigns and usage analytics.
- Reduced enterprise onboarding time from 45 to 21 days, increasing 90-day retention from 82% to 94%.
Product Management (Outcomes Over Outputs)
The Problem: PM resumes list features shipped ("Launched 10 features in Q2") without proving those features drove business value. Shipping volume is an output metric—it proves activity, not impact.
The Fix: Every PM bullet must connect features to measurable outcomes: adoption rate (proof users wanted it), engagement lift (proof it stuck), revenue impact (proof it drove business value), or efficiency gains (proof it improved the product development process itself). The formula: feature shipped + adoption/engagement metric + business outcome.
Strong Verbs (Product Management):
| Weak Verb | Strong Replacement | Impact Example |
|---|---|---|
| Managed | Led / Prioritized | Led roadmap prioritization for 3 products, shipping features that increased DAU by 15%. |
| Shipped | Launched / Delivered | Launched onboarding redesign with 67% adoption rate, improving activation by 18 pp. |
| Worked on | Defined / Drove | Defined success metrics for checkout flow, driving 23% conversion improvement. |
| Coordinated | Aligned / Enabled | Enabled 8-person team to ship search improvements, increasing CTR by 27%. |
Real Bullets (Gold Standard):
- Launched collaborative workspace feature, increasing DAU by 12% (from 45K to 50.4K users) and weekly session frequency by 18%.
- Shipped premium tier features, driving $1.8M in incremental ARR within first 6 months (22% of total new revenue).
- Defined product analytics framework adopted by 5 product teams, improving feature instrumentation coverage from 34% to 89%.
Product Management: The Four Outcome Categories
Product managers prove value through adoption (did users use it?), engagement (did it stick?), revenue (did it drive business value?), and efficiency (did you ship faster or smarter?). The strongest PM bullets pair feature context (what you shipped and why) with outcome metrics (what changed because you shipped). Adoption metrics prove product-market fit at the feature level. Engagement metrics prove lasting value, not novelty spikes. Revenue metrics tie product work to company financial goals. Efficiency metrics show you improved how the team works, not just what they built. For comprehensive guidance on quantifying PM impact across all four dimensions with calculation methodologies and senior vs. junior PM differentiation, see our Product Manager Resume Metrics guide.
Adoption & Engagement Examples:
- Launched in-app messaging with 67% adoption rate among active users within 30 days, exceeding 50% target.
- Redesigned onboarding flow, increasing activation rate from 34% to 52% (18 pp lift) and reducing time-to-first-value from 5 days to 1.5 days.
- Shipped personalized content feed, improving Day-30 retention from 28% to 41% (13 pp lift) across 120K user cohort.
Revenue & Efficiency Examples:
- Rebuilt checkout flow, increasing trial-to-paid conversion from 14% to 19% (5 pp lift), adding $420K ARR.
- Streamlined feature flagging process, reducing average time-to-production from 4 weeks to 1.5 weeks across engineering team.
- Launched self-serve admin tools, reducing internal support tickets by 38% (2,400 fewer tickets/month).
Data & Analytics (Make the Insight Measurable)
The Problem: Analytics bullets often read like “made dashboards”. Dashboards are output. Business decisions are value.
The Fix: Tie analysis to decisions and outcomes: revenue protected, costs reduced, forecast accuracy improved, experiments shipped.
Strong Verbs (Data):
| Weak Verb | Strong Replacement | Impact Example |
|---|---|---|
| Created | Instrumented / Operationalized | Operationalized KPI dashboard for 15 stakeholders, improving decision cadence. |
| Analyzed | Identified / Diagnosed | Identified churn driver and launched fix, reducing churn risk by 9%. |
| Reported | Forecasted / Modeled | Modeled demand and improved forecast accuracy from 72% to 88%. |
Real Bullets (Gold Standard):
- Identified 3 conversion drop-offs via funnel analysis, driving fixes that increased signup-to-activation by 6%.
- Automated weekly reporting pipeline, saving 10 hours/week and enabling faster exec reviews.
Data Science: Model Performance to Business Value
The strongest data science resumes don't list model accuracy in isolation—they prove deployed models drove measurable business outcomes. DS value is measured across four layers: model performance (accuracy, precision, recall with baseline comparison), deployment scale (predictions per day, users served, production uptime), business impact (revenue driven, costs reduced, decision quality improved), and infrastructure efficiency (training time reduced, serving costs optimized). The formula: model quality + production scale + business outcome = DS credibility. Model performance proves technical competence—but only when paired with deployment context. Production deployment at scale proves engineering rigor, not just research skill. Business impact metrics separate senior DS (models that drive decisions) from junior DS (models that sit in notebooks). For comprehensive guidance on quantifying DS work including calculation methodologies for deployment metrics, A/B test lift analysis, and distinguishing research contributions from production impact, see our Data Science Resume Metrics guide.
Model Performance & Deployment Examples:
- Built churn prediction model with 0.87 AUC-ROC (vs. 0.71 heuristic baseline) and 82% precision at 65% recall, deployed to 340K user base.
- Deployed recommendation engine serving 2.3M predictions/day to 450K active users with 99.6% uptime and <50ms p95 latency.
- Improved demand forecasting accuracy from 76% to 89% (13 pp lift), reducing RMSE from 12.4 to 7.8 units.
Business Impact & Efficiency Examples:
- Deployed dynamic pricing model increasing revenue by $3.2M annually (8% lift) across 45K transactions/month.
- Built fraud detection model reducing losses by $1.8M/year while decreasing false positive review queue by 42%.
- Rebuilt feature engineering pipeline reducing model training time from 6 hours to 22 minutes, enabling daily retraining vs. weekly.
Marketing & Growth (Prove the ROI)
The Problem: Marketing resumes claim "creative campaigns" and "brand awareness" without tying them to pipeline, revenue, or conversion.
The Fix: Marketing is a revenue function. Every bullet should connect activity to business metrics: CAC, LTV, conversion rate, pipeline, MQLs, retention.
Strong Verbs (Marketing):
| Weak Verb | Strong Replacement | Impact Example |
|---|---|---|
| Ran | Launched / Executed | Launched email nurture campaign, increasing MQL→SQL conversion by 18%. |
| Created | Designed / Built | Built landing page A/B test framework, improving conversion from 2.1% to 3.4%. |
| Managed | Optimized / Scaled | Scaled paid search spend from $50K to $200K/month while reducing CAC by 22%. |
| Promoted | Drove / Generated | Drove 2,400 event registrations via multi-channel campaign, converting 320 SQLs. |
Real Bullets (Gold Standard):
- Reduced customer acquisition cost (CAC) from $180 to $125 by optimizing ad targeting and landing page conversion, saving $220K annually.
- Generated $1.8M pipeline via content marketing program (SEO + lead magnets), increasing organic traffic by 140% YoY.
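CAC savings like the bullet above are a two-step calculation you should be able to defend. A quick sketch; the 4,000 customers/year volume is an assumption chosen for illustration, not a figure from this guide:

```python
def cac(total_spend, customers_acquired):
    """Customer acquisition cost: total acquisition spend per new customer."""
    return total_spend / customers_acquired

def annual_cac_savings(old_cac, new_cac, customers_per_year):
    """Annual savings from a CAC reduction, holding acquisition volume constant."""
    return (old_cac - new_cac) * customers_per_year

# Hypothetical: CAC drops from $180 to $125 at 4,000 new customers/year.
print(annual_cac_savings(180, 125, 4_000))  # prints 220000
```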
Marketing Attribution: Connecting Spend to Revenue
The strongest marketing resumes don't just show CAC reduction—they prove attribution clarity. Marketing value is measured through three layers: revenue linkage (CAC, LTV:CAC ratio, ROAS), conversion efficiency (funnel stage conversion rates, cost per conversion, pipeline contribution), and audience quality (qualified traffic growth, engagement tied to conversions). The formula: specify your attribution model (first-touch, multi-touch, last-touch) + show the economic impact + prove funnel optimization. For complete methodologies on calculating marketing ROI, attribution models, and efficiency metrics across performance and brand marketing, see our Marketing Metrics guide.
Content & Community: Engagement Over Vanity
The strongest content and community resumes don't list follower counts—they prove engagement quality and conversion outcomes. Content/community value is measured across three layers: engagement stickiness (DAU/MAU ratio, return visitor rate), interaction depth (engagement rate vs. platform benchmarks, comments-to-views ratio, time on page), and business conversion (content-attributed leads, community-to-customer conversion). The formula: audience size + engagement metric (DAU/MAU or engagement rate) + conversion outcome = content ROI proof. DAU/MAU above 20% proves users return frequently, not just sign up and ghost. Engagement rate above platform benchmarks proves content resonates. For comprehensive methodologies on calculating DAU/MAU, platform-specific engagement benchmarks, content attribution, and role-specific metrics for Content Marketing, Social Media, and Community Management, see our Content & Community Metrics guide.
Engagement & Retention Examples:
- Grew community DAU/MAU from 12% to 31% by implementing daily discussion prompts and gamification system across 15K members.
- Maintained 6.2% avg. engagement rate across LinkedIn (4x industry avg.) by shifting from promotional to education-first content strategy.
- Increased return visitor rate from 22% to 58% by launching weekly newsletter series and topic-based content hubs.
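The DAU/MAU stickiness ratio cited above is a one-line calculation. A minimal sketch with hypothetical member counts:

```python
def dau_mau(avg_daily_actives, monthly_actives):
    """Stickiness: share of monthly active users active on an average day (%)."""
    return avg_daily_actives / monthly_actives * 100

# Hypothetical 15,000-member community averaging 4,650 daily actives:
ratio = dau_mau(4_650, 15_000)
print(f"{ratio:.0f}% DAU/MAU")  # prints "31% DAU/MAU" (above the 20% stickiness bar)
```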
Content Conversion & Attribution Examples:
- Achieved 8.3% blog-to-signup conversion rate by creating bottom-of-funnel guides and optimizing CTAs, driving 2,400 qualified leads in 6 months.
- Generated $4.2M in content-influenced pipeline by creating targeted guides for each buying stage, tracked via HubSpot attribution.
- Grew UGC from 15 posts/month to 320 posts/month by launching creator incentive program and featured contributor series.
Finance & Accounting (Show the Controls)
The Problem: Finance bullets say "prepared reports" or "managed accounts". Reports are not value. Accurate forecasts, risk mitigation, and cost savings are value.
The Fix: Tie finance work to accuracy, controls, audit findings, cost reduction, variance, compliance, and stakeholder decision-making.
Strong Verbs (Finance):
| Weak Verb | Strong Replacement | Impact Example |
|---|---|---|
| Prepared | Delivered / Produced | Delivered monthly variance analysis, enabling CFO to reallocate $2M budget faster. |
| Reconciled | Validated / Audited | Audited GL transactions, identifying $340K discrepancy and implementing SOX controls. |
| Tracked | Monitored / Forecasted | Forecasted quarterly revenue with 98% accuracy, supporting board decision-making. |
| Managed | Controlled / Optimized | Optimized AP workflow, reducing Days Payable Outstanding (DPO) from 52 to 38 days. |
Real Bullets (Gold Standard):
- Reduced close cycle from 10 days to 6 days by automating journal entries and implementing new reconciliation process across 4 entities.
- Identified $1.2M in cost-saving opportunities via spend analysis, driving vendor consolidation and contract renegotiation.
Finance & Analytics: Model Accuracy & Decision Impact
The strongest finance and analytics resumes prove analytical work drives decisions. Financial value is measured across three layers: model accuracy (forecast variance, prediction precision), cost identification (opportunities surfaced, risks flagged), and decision enablement (adoption rate, actions taken, business outcomes). The formula: accurate models + identified opportunities + decisions informed = analytical credibility. Finance metrics prove you turn data into action—they show your analysis doesn't sit in decks, it drives resource allocation and strategic choices. For comprehensive methodologies on forecast accuracy, attribution analysis, scenario planning, and decision impact measurement with calculation frameworks, see our Finance & Analytics Metrics guide.
Model Accuracy & Forecasting Examples:
- Maintained forecast accuracy within 5% variance across 12 consecutive quarters, enabling confident capital planning.
- Built revenue prediction model with 92% accuracy that reduced inventory carrying costs by $400K.
- Improved budget variance from 18% to 7% through enhanced baseline modeling and assumption tracking.
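Forecast variance, the metric behind the bullets above, is one division: the absolute gap between forecast and actual, relative to the actual. A minimal sketch with hypothetical figures (all numbers illustrative, not from any real model):

```python
def forecast_variance(forecast, actual):
    """Absolute variance of a forecast relative to the actual result."""
    return abs(actual - forecast) / actual

# Hypothetical quarter: forecast $9.6M revenue, actual $10.0M
v = forecast_variance(forecast=9_600_000, actual=10_000_000)
print(f"{v:.1%}")  # 4.0% — inside the "within 5% variance" threshold
```

If you claim "98% accuracy" or "within 5% variance" on a resume, this is the calculation an interviewer will expect you to walk through.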
Decision Impact & Cost Identification Examples:
- Financial model informed $15M acquisition decision (board-approved based on NPV and payback analysis).
- Identified $1.2M in vendor consolidation opportunities through spend analysis across 200 contracts.
- Built attribution model that identified $2M in undervalued marketing channels, informing budget reallocation.
Revenue Impact for Non-Sales Roles (Cost, Efficiency, Speed)
The Problem: Most non-sales professionals believe revenue metrics only apply to sales teams. But revenue isn't just generated at the point of sale—it's affected by cost structure (savings = profit), operational efficiency (capacity = revenue potential), and time-to-market (speed = earlier monetization).
The Fix: Prove revenue contribution through three impact levers: cost savings (direct profit improvement), efficiency gains (increased output without proportional cost), and time-to-market acceleration (capturing revenue earlier). Every function affects at least one of these dimensions.
Revenue Measurement Categories:
| Impact Type | What It Measures | Proof Pattern |
|---|---|---|
| Cost Savings | Expense reduction, waste avoided | (Baseline Cost - New Cost) × Volume × Timeframe |
| Efficiency | Output per input, capacity gains | Time Saved × Hourly Cost × Frequency × Annual Multiplier |
| Time-to-Market | Revenue acceleration | (Revenue Per Month × Months Saved) + Market Share Protected |
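The three proof patterns in the table are plain arithmetic. A minimal sketch, with hypothetical inputs you would replace with your own baseline data:

```python
# Sketch of the three proof patterns above. All inputs are
# illustrative placeholders — substitute your own baseline numbers.

def cost_savings(baseline_cost, new_cost, volume, periods):
    """(Baseline Cost - New Cost) x Volume x Timeframe"""
    return (baseline_cost - new_cost) * volume * periods

def efficiency_gain(hours_saved, hourly_cost, frequency_per_year):
    """Time Saved x Hourly Cost x Frequency x Annual Multiplier"""
    return hours_saved * hourly_cost * frequency_per_year

def time_to_market(revenue_per_month, months_saved, share_protected=0):
    """(Revenue Per Month x Months Saved) + Market Share Protected"""
    return revenue_per_month * months_saved + share_protected

# Example: invoice automation saving 25 hours/week at a $50/hour
# loaded labor cost, 52 weeks/year
print(efficiency_gain(25, 50, 52))  # 65000
```

That last line is exactly how a bullet like "$65k annual labor savings" should be derived: if you can't reproduce the number from a formula like this, don't put it on the resume.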
Cost Savings Examples:
- Renegotiated vendor contracts, reducing annual software licensing costs by $120k (18% savings).
- Automated invoice processing, eliminating 25 hours/week of manual work ($65k annual labor savings).
- Consolidated 3 legacy systems into unified platform, reducing operational costs by $200k/year.
Efficiency & Speed Examples:
- Reduced customer onboarding time from 6 weeks to 10 days, enabling 3x faster revenue recognition.
- Improved deployment pipeline speed by 40%, allowing team to ship 2 additional features per quarter.
- Accelerated product launch by 6 weeks, capturing $300k in early-quarter revenue.
For complete methodologies on calculating and defending non-sales revenue impact, including role-specific translation examples for Engineering, Product, Operations, Marketing, and Design, see our Revenue Metrics for Non-Sales Roles guide.
Healthcare & Clinical (Patient Outcomes + Compliance)
The Problem: Healthcare resumes list duties ("Administered medications", "Documented care") without showing patient outcomes, safety improvements, or compliance.
The Fix: Healthcare is measured by outcomes, safety, compliance, throughput, and patient satisfaction. Every bullet should tie clinical work to measurable improvement.
Strong Verbs (Healthcare):
| Weak Verb | Strong Replacement | Impact Example |
|---|---|---|
| Provided | Delivered / Coordinated | Coordinated care for 18-bed unit, achieving 94% patient satisfaction (HCAHPS). |
| Administered | Managed / Optimized | Managed medication administration protocol, reducing errors by 35% in 6 months. |
| Assisted | Supported / Enabled | Enabled discharge readiness for 120 patients/month, cutting readmission by 12%. |
| Documented | Maintained / Ensured | Ensured 100% EHR compliance during Joint Commission audit with zero findings. |
Real Bullets (Gold Standard):
- Reduced central line infection rate from 2.1 to 0.4 per 1,000 line-days by leading bundle compliance training for 45-nurse unit.
- Improved patient throughput by 18% (ED to inpatient bed) by redesigning triage workflow and staffing model.
Education & Training (Learning Outcomes, Not Just Teaching)
The Problem: Education resumes say "Taught classes" or "Developed curriculum". Teaching is the activity. Learning is the value.
The Fix: Show learning outcomes: test scores, pass rates, engagement, skill acquisition, behavior change, retention, graduation rates.
Strong Verbs (Education):
| Weak Verb | Strong Replacement | Impact Example |
|---|---|---|
| Taught | Delivered / Facilitated | Delivered AP Biology to 90 students, achieving 85% pass rate (vs. 68% district). |
| Developed | Designed / Built | Built project-based curriculum, increasing student engagement scores by 24%. |
| Managed | Led / Coordinated | Led IEP team for 12 students, improving goal attainment from 62% to 89%. |
| Tutored | Coached / Mentored | Mentored 15 at-risk students, improving semester GPA from 2.1 to 2.9 average. |
Real Bullets (Gold Standard):
- Increased AP Calculus pass rate from 71% to 88% by redesigning problem sets and implementing weekly review sessions.
- Reduced absenteeism by 22% through student engagement program, improving overall course completion rate by 14%.
Legal & Compliance (Risk Mitigation + Audit Results)
The Problem: Legal resumes say "Reviewed contracts" or "Ensured compliance". Reviewing is activity. Preventing risk, passing audits, and closing deals are outcomes.
The Fix: Show risk prevented, contracts closed, audit findings, dispute resolution speed, compliance rate, regulatory approvals.
Strong Verbs (Legal):
| Weak Verb | Strong Replacement | Impact Example |
|---|---|---|
| Reviewed | Negotiated / Finalized | Negotiated 28 vendor contracts, reducing liability exposure and saving $180K/year. |
| Drafted | Prepared / Delivered | Prepared merger documentation for $50M acquisition, closing deal 3 weeks early. |
| Advised | Guided / Protected | Guided product team through GDPR compliance, avoiding regulatory penalties. |
| Managed | Resolved / Mitigated | Resolved 14 disputes via mediation, avoiding $2.3M litigation cost. |
Real Bullets (Gold Standard):
- Achieved zero audit findings during SOX 404 compliance review by implementing new internal controls across 6 subsidiaries.
- Reduced contract cycle time from 45 days to 18 days by standardizing templates and redline process.
Legal & Compliance: Risk Reduction as Measurable Value
Legal and compliance professionals face a unique challenge: proving you prevented disasters is harder than proving you generated revenue. But prevention is measurable when you establish baselines. The strongest legal resumes quantify contract cycle time (enablement speed), audit findings (control effectiveness), regulatory compliance rates (risk mitigation), and incident response time (operational reliability). The formula: baseline metric (before your intervention) + improved metric (after) + business impact (cost avoided, deals accelerated, or penalties prevented). This proves you don't just review contracts—you accelerate business velocity while reducing exposure. For comprehensive methodologies on quantifying legal impact including contract turnaround metrics, audit finding reduction, and regulatory penalty avoidance calculations, see our Legal & Compliance Resume Metrics guide.
Risk Reduction Examples:
- Reduced contract review cycle from 14 days to 7 days by creating standardized playbooks, enabling $5M in deals to close 50% faster.
- Maintained zero regulatory penalties for 36 consecutive months after implementing compliance program (previous 3 years: $500K in fines).
- Reduced audit findings from 22 to 3 (86% reduction) over 18 months by implementing automated compliance monitoring.
Human Resources & Talent (Hiring Speed + Retention)
The Problem: HR resumes say "Recruited candidates" or "Managed onboarding". Recruiting is not the win. Time-to-fill, quality of hire, retention, and offer acceptance are wins.
The Fix: HR is measured by hiring speed, retention, cost-per-hire, diversity, performance ratings of new hires, and employee satisfaction.
Strong Verbs (HR):
| Weak Verb | Strong Replacement | Impact Example |
|---|---|---|
| Recruited | Sourced / Hired | Hired 32 engineers in Q2, reducing time-to-fill from 67 to 42 days. |
| Managed | Streamlined / Optimized | Optimized onboarding program, improving 90-day retention from 82% to 91%. |
| Conducted | Led / Facilitated | Led diversity hiring initiative, increasing underrepresented hires by 35%. |
| Handled | Resolved / Mediated | Resolved 18 employee relations cases, maintaining zero legal escalations. |
Real Bullets (Gold Standard):
- Reduced cost-per-hire from $8,200 to $5,900 by building employee referral program that sourced 40% of new hires.
- Improved new hire performance ratings (first-year average) from 3.2 to 3.8/5.0 by redesigning interview rubrics and training.
QA & Testing (Bug Prevention Impact)
The Problem: QA resumes say "Executed test cases" or "Found bugs". Test execution is activity. Quality improvement—fewer production defects, faster releases, higher test coverage—is value.
The Fix: Show defect density reduction, test coverage growth, automation ROI, release quality (zero-defect releases), and testing efficiency improvements.
Strong Verbs (QA/Testing):
| Weak Verb | Strong Replacement | Impact Example |
|---|---|---|
| Tested | Validated / Verified | Validated 200+ test cases per sprint with 95% bug detection rate. |
| Found | Detected / Identified | Detected 92% of critical bugs in QA (23 of 25), reducing defect escape rate to 8%. |
| Automated | Built / Implemented | Built automated test suite covering 70% of regression tests, reducing test time by 65%. |
| Ran | Executed / Optimized | Optimized test suite runtime from 90 minutes to 15 minutes via parallel execution. |
Real Bullets (Gold Standard):
- Reduced production defects from 50 per release to 15 (70% reduction) over 12 months through expanded regression coverage and exploratory testing.
- Increased automated test coverage from 55% to 82% over 18 months, reducing regression testing time by 65% (from 40 hours to 14 hours per release).
QA & Testing: Prevention Metrics That Prove Quality
The strongest QA resumes don't count bugs found—they prove bug prevention impact. QA value is measured across four dimensions: defect density (bugs per release trending downward), defect escape rate (percentage of bugs caught in QA vs. production), test coverage (code/feature coverage percentage), and automation ROI (time saved or velocity gained through automated testing). The formula: baseline defect count + improved defect count + test efficiency improvement = measurable quality gains. Defect escape rate below 10% proves your testing process is catching problems before customers see them. Test automation that reduces regression time from days to hours proves you're enabling faster releases. For comprehensive methodologies on quantifying QA impact including defect detection efficiency, flaky test reduction, and release quality metrics, see our QA & Testing Resume Metrics guide.
Defect Prevention & Coverage Examples:
- Reduced defect escape rate from 25% to 8% by implementing risk-based testing and edge case scenario coverage.
- Achieved 95% code coverage across critical services, enabling continuous deployment with <15min test suite execution.
- Maintained defect escape rate below 10% for 8 consecutive releases (industry average: 20-30%).
Automation & Efficiency Examples:
- Automated 400 regression test cases, reducing full regression cycle from 5 days to 8 hours and enabling weekly releases (previously monthly).
- Reduced flaky test rate from 15% to 3% by refactoring timing-dependent waits, improving CI pipeline reliability from 70% to 95%.
- Built CI/CD-integrated test suite executing in <10 minutes, enabling 50+ daily deployments vs. previous 2-3 weekly releases.
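Defect escape rate, cited throughout this section, is a single ratio: defects found in production divided by total defects (QA + production). A minimal sketch using the numbers from the verb table above (23 of 25 critical bugs caught in QA):

```python
def defect_escape_rate(bugs_in_production, bugs_in_qa):
    """Share of total defects that slipped past QA into production."""
    total = bugs_in_production + bugs_in_qa
    return bugs_in_production / total

# 23 of 25 critical bugs caught in QA -> 2 escaped to production
rate = defect_escape_rate(bugs_in_production=2, bugs_in_qa=23)
print(f"{rate:.0%}")  # 8%
```

Tracking this per release, against a baseline, is what turns "found bugs" into a defensible quality metric like "reduced defect escape rate from 25% to 8%".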
The Verb Selection Framework: Function-Specific Taxonomy
Generic verbs kill resumes before recruiters finish reading the first bullet. "Managed," "worked on," "helped with"—these apply to every role and signal nothing about your domain expertise or seniority level.
The verb you choose is the first signal of your functional expertise. It tells recruiters:
- What you do (your function: engineering, sales, operations)
- How senior you are (your decision-making level)
- How technical you are (your depth of expertise)
Why Function-Specific Verbs Matter
Every function has its own action vocabulary:
- Engineers architect, optimize, and scale systems
- Sales professionals close, negotiate, and convert deals
- Operations managers orchestrate, streamline, and coordinate workflows
- Product managers launch, prioritize, and iterate on features
- Marketers drive, optimize, and generate measurable outcomes
Using the wrong verb category immediately flags you as:
- Junior/inexperienced (using support verbs like "helped" or "assisted")
- Outside your claimed function (an engineer saying "managed stakeholders" instead of "architected systems")
- Generic/commodity ("managed projects" could be anyone)
The Core Verb Categories by Function
Engineering: Build, Optimize, Scale
Core verbs: Architected, Engineered, Deployed, Optimized, Refactored, Migrated, Automated, Debugged, Scaled, Instrumented
Avoid: Worked on, Fixed, Improved (too generic)
Why these work: They signal technical execution and architectural thinking, not just code writing
Sales: Acquire, Negotiate, Retain
Core verbs: Closed, Negotiated, Sourced, Converted, Landed, Cultivated, Prospected, Qualified, Upsold, Retained
Avoid: Called, Sold, Presented (activity, not outcomes)
Why these work: They prove revenue generation and deal execution, not just activity logging
Operations: Coordinate, Streamline, Optimize
Core verbs: Orchestrated, Streamlined, Coordinated, Standardized, Optimized, Implemented, Automated, Monitored, Resolved, Scaled
Avoid: Handled, Managed, Tracked (administrative, not strategic)
Why these work: They signal process management and efficiency improvement, not just task completion
Product Management: Launch, Prioritize, Deliver
Core verbs: Launched, Prioritized, Validated, Defined, Delivered, Partnered, Analyzed, Iterated
Avoid: Managed, Worked on, Helped (no ownership signal)
Why these work: They show product ownership and strategic decision-making
The Ownership Hierarchy: Senior vs Junior Verbs
Verbs signal your level in the org chart:
Junior/Subordinate Verbs:
- Helped, Assisted, Supported, Participated in, Contributed to, Worked on
Senior/Owner Verbs:
- Led, Architected, Drove, Orchestrated, Launched, Negotiated, Built, Scaled
If you're applying for a senior role but your resume uses subordinate verbs, you've positioned yourself as support staff before the recruiter reads your metrics.
How to Choose the Right Verb
Step 1: Identify your core function (Engineering? Sales? Ops? Product? Marketing?)
Step 2: Match your bullet to the type of work within that function:
- Building something new? → Built, Engineered, Developed, Launched
- Improving existing systems? → Optimized, Streamlined, Refactored, Enhanced
- Managing scale or complexity? → Scaled, Orchestrated, Coordinated
- Acquiring or converting? → Closed, Converted, Sourced, Landed
Step 3: Check ownership level:
- Did you lead this initiative? → Use ownership verbs (Led, Drove, Architected)
- Did you support someone else? → Use enabling verbs (Enabled, Unblocked, Facilitated)
Critical Rule: Never use subordinate verbs ("helped," "assisted") unless you're genuinely describing support work. If you drove the outcome, use an ownership verb.
Common Verb Mismatches (And How to Fix Them)
Mismatch #1: Technical Role Using Non-Technical Verbs
❌ "Managed database performance"
✅ "Optimized PostgreSQL queries, reducing DB load by 40%"
Why: Engineers optimize and architect—"managed" is a project management verb
Mismatch #2: Sales Role Using Activity Verbs
❌ "Called 100 prospects weekly"
✅ "Prospected 100 leads weekly, converting 15% to qualified opportunities"
Why: Prospecting signals sales methodology, "called" signals telemarketing activity
Mismatch #3: Operations Role Using Passive Ownership
❌ "Responsible for logistics coordination"
✅ "Orchestrated logistics across 4 warehouses, improving on-time delivery to 98%"
Why: "Orchestrated" signals complex multi-stakeholder coordination—"responsible for" signals job description, not achievement
For a complete breakdown of function-specific verb taxonomies with before/after examples across Engineering, Sales, Operations, Product, Marketing, Finance, HR, and Design roles, see our comprehensive Action Verbs by Function guide.
How to Use This Dictionary
Bookmark the Professional Impact Dictionary and return whenever you're writing bullets.
The Dictionary Usage Framework: The dictionary contains 500+ formulas, but formulas are not sentences to copy—they're structures to adapt. Navigate by function first (Engineering, Sales, Marketing, Operations), then by seniority level (IC, Manager, Director). Select 3-5 formulas where you have real data to insert. Replace every placeholder with your specific numbers, methods, and outcomes. The copy-paste trap: borrowed language without personal data creates generic resumes that fail both ATS and recruiter scanning. The adaptation test: after filling a formula, can you explain every number in an interview? If any answer is no, the adaptation is incomplete. For the complete step-by-step process on navigating, selecting, and adapting formulas without falling into copy-paste traps, see our Professional Impact Dictionary usage guide.
The Process:
1. Open your current resume
2. Pick one bullet that feels weak or generic
3. Identify the verb. If it's "Helped" or "Responsible for" → replace it using the tables above
4. Add context: What was the constraint? Team size? Budget? Timeline? System complexity?
5. Add a metric: $, %, hours, volume, error rate, adoption, retention
6. Ask: "So what?" Connect the metric to a business outcome (revenue, cost, speed, quality, risk)
If you can't do steps 4–6, the bullet is fluff. Delete it or dig deeper until you find the real story.
Why Most Resumes Fail the "Gold Filter"
Here's the uncomfortable truth: 95% of resumes are full of air.
They list duties, not results. They say "Managed projects" without saying what the projects achieved. They claim "Strong communication skills" without proving it through outcomes.
The Gold Filter is simple:
Can this sentence be turned into a quantifiable bullet point?
- YES → Keep it. Polish it. Make it dense.
- NO → Delete it immediately.
Example of Fluff:
"Passionate marketing professional with strong analytical skills and a proven track record of success."
What's wrong? Zero facts. Zero proof. Zero weight.
Example of Gold:
"Marketing Manager with 7 years leading $1M+ ad budgets. Scaled user base from 10K to 100K in 12 months via SEO + PPC. Reduced CAC by 30%."
What's right? Three facts. Three metrics. Three proof points.
The difference between these two candidates is not skill—it's translation. The first one didn't bother to translate their work into the language employers buy.
How to Apply the Formula (Role-Specific Proof)
The formula works for every role. The difference is in the metrics and constraints you choose.
Example: Software Engineer
Bad: "Built feature X"
Formula Applied: "Built feature X that reduced latency by 40%, enabling 2M more daily transactions and $500K additional revenue"
Translation:
- Verb: Built
- Context: Feature X (latency problem)
- Metric: 40% reduction, 2M transactions
- Outcome: $500K revenue
See Software Engineer Resume Guide for technical-to-business translation.
Example: Product Manager
Bad: "Managed product roadmap"
Formula Applied: "Led roadmap for 3 products, shipping 12 features in 6 months that increased user retention by 18% and added $2M ARR"
Translation:
- Verb: Led
- Context: 3 products, 12 features, 6 months
- Metric: 18% retention, $2M ARR
- Outcome: Revenue growth
See Product Manager, Project Manager, Business Analyst, Scrum Master, Construction PM. For PM-specific ATS keywords covering roadmapping, stakeholder management, and Agile methodologies, see our Product Manager keywords guide.
Example: Marketing Manager
Bad: "Ran marketing campaigns"
Formula Applied: "Launched 5 campaigns across SEO + PPC that reduced CAC from $180 to $125 and generated $1.8M pipeline"
Translation:
- Verb: Launched
- Context: 5 campaigns, SEO + PPC
- Metric: CAC -30%, $1.8M pipeline
- Outcome: Cost efficiency + revenue
See Marketing Manager, Marketing ROI, Social Media Manager, SEO Specialist, Content Manager, Copywriter, Brand Manager, PR Specialist.
Example: Sales Representative
Bad: "Exceeded quota"
Formula Applied: "Closed $1.2M ARR across 18 enterprise accounts, achieving 135% of quota and improving win rate from 14% to 21%"
Translation:
- Verb: Closed
- Context: 18 enterprise accounts
- Metric: $1.2M ARR, 135% quota, 21% win rate
- Outcome: Revenue + efficiency
See Sales Representative, Recruiter.
The Pattern Across All Roles
Every role has different metrics, but the formula stays the same:
- Engineering: Latency, uptime, cost, velocity → Full Stack, Backend, Frontend, DevOps, Cloud Architect, Cybersecurity, QA, Systems Admin, Network Engineer, IT Support, Prompt Engineer, AI Engineer, ML Engineer, Data Scientist. For ATS-optimized keyword lists by specialty: Software Engineer keywords, Frontend keywords, Backend keywords, Full Stack keywords, DevOps keywords, Cloud Architect keywords
- Finance: Close cycle, variance, forecast accuracy → Accountant, Financial Analyst, Data Analyst. For data role keyword optimization, see Data Analyst keywords and Data Scientist keywords
- Healthcare: Patient outcomes, safety, compliance → Registered Nurse, Nursing Clinical, Medical Assistant, Pharmacy Technician
- Creative: Campaign performance, conversion, engagement → Graphic Designer, Portfolio Guide, UX/UI Designer, Design Metrics, Technical Writer. For design keyword optimization: UX Designer keywords
- Admin/Support: Time saved, efficiency, CSAT → Administrative Assistant, Executive Assistant, Customer Service, Retail Store Manager
- Education/Research: Learning outcomes, test scores → Teacher, Research Assistant
- HR & People Operations: Hiring speed, retention → HR & People Metrics, HR Manager
- Supply Chain & Logistics: Flow efficiency, cost management → Supply Chain & Logistics Metrics
- Legal: Risk mitigation → Paralegal
- Trades: Safety, quality, project completion → Electrician, Mechanical Engineer, Real Estate Agent
- Operations: Throughput, cost efficiency → Operations Manager, Supply Chain Manager, Logistics
- Industry-Specific: Consulting, Education, Energy, Finance, Government, Healthcare, Hospitality, Insurance, Legal, Manufacturing, Non-Profit, Real Estate, Retail, Event Planning, Remote Work, Tech Startup
The formula doesn't change. Only the metrics do.
For forbidden words and power verbs: 5 Words Killing Your Resume, Resume Action Verbs & Power Words, Prompt Engineer Frameworks, AI Engineer Deployment Spec.
Final Thoughts
This dictionary will be updated continuously. New roles, new industries, new metrics. Bookmark it. Return when you need to upgrade weak bullets into gold.
Your resume is not a biography. It is a sales document. Every line either sells you or fails you. There is no middle ground.
Use this dictionary to filter out fluff and keep only the gold that gets you hired.