Kickstart Your Engineering Book Club

Engineers are notoriously resourceful, and Lob’s core services team is no exception. When faced with a problem (in this case, a failing service due to a database issue), the team put on their deerstalker hats and began to investigate.

Our first topic to zero in on was PostgreSQL. The goal was to increase our confidence in how our database works and to foster discussion and ideas around design decisions for our architecture, with an eye on database optimization.

We thought we’d share the format of our sessions (thank you, Engineering Manager Ross Martin) along with a specific example so you get the idea. We found this template worked well to guide our discussions (read: keep us from getting distracted). We created a template in Notion to repurpose for each session; it also serves as our repository for capturing learnings.

As Dr. Stephen Covey suggests, “Begin with the end in mind.” Because we were having performance issues with specific types of queries on large tables (which were causing requests to time out), we identified two goals up front: 1) learn about the different indexes provided by Postgres and be able to explain them, and 2) understand their benefits and provide guidance on when to use a specific index.

Prompts may differ based on goals or content, but a few basic ones help guide the discussion. We like to rotate facilitators across sessions to give each member of the team a chance to level up.

Indexes are used to either ensure data consistency OR improve read access.

TID (Tuple Identifier) — A TID is a pair (block number, tuple index within that block) that identifies the physical location of a row within a table; e.g., (10, 5) refers to the 5th tuple in block 10 of a table or index.
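As a quick illustration, Postgres exposes each row’s TID through the ctid system column; the table name below is hypothetical:

```sql
-- ctid is a system column containing each row's TID: (block number, tuple index).
-- "letters" is a hypothetical table used only for illustration.
SELECT ctid, id
FROM letters
LIMIT 3;

--  ctid  | id
-- -------+----
--  (0,1) |  1
--  (0,2) |  2
--  (0,3) |  3
```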

Constraints such as UNIQUE, PRIMARY KEY, or EXCLUDE USING can only be implemented in PostgreSQL with a backing index. (A unique constraint requires searching for the specific value, and an index avoids the need for a sequential scan.)
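For instance, declaring the constraints below makes Postgres create the backing indexes automatically (the table and column names are illustrative, not taken from our schema):

```sql
-- UNIQUE and PRIMARY KEY are enforced through implicitly created B-tree indexes,
-- so checking whether a value already exists is an index lookup, not a table scan.
CREATE TABLE accounts (
    id    bigint GENERATED ALWAYS AS IDENTITY PRIMARY KEY, -- creates accounts_pkey
    email text UNIQUE                                       -- creates accounts_email_key
);

-- EXCLUDE USING also needs a backing index; the classic example forbids
-- overlapping reservations for the same room.
CREATE EXTENSION IF NOT EXISTS btree_gist; -- allows "room WITH =" in a GiST index
CREATE TABLE bookings (
    room   int,
    during tsrange,
    EXCLUDE USING gist (room WITH =, during WITH &&)
);
```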

An index cannot alter the result of a query. An index only provides another access method to the data.

As a consequence, each index adds write costs to your DML queries: insert, update and delete now have to maintain the indexes too, and in a transactional way.
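One way to make that cost visible, using a hypothetical events table, is to time the same bulk INSERT with and without secondary indexes:

```sql
-- Each INSERT must also update every index on the table, within the same transaction.
CREATE TABLE events (
    id         bigserial PRIMARY KEY,
    payload    jsonb,
    created_at timestamptz DEFAULT now()
);
CREATE INDEX events_created_at_idx ON events (created_at);
CREATE INDEX events_payload_idx    ON events USING gin (payload);

-- EXPLAIN ANALYZE runs the statement and reports execution time; repeating it after
-- dropping the secondary indexes shows how much of that time is index maintenance.
EXPLAIN ANALYZE
INSERT INTO events (payload)
SELECT '{"type": "render"}'::jsonb
FROM generate_series(1, 100000);
```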

Running EXPLAIN on common queries is a great way to see which indexes are currently in use, and whether they can be improved.
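A minimal sketch of that workflow; the table, columns, and index name are hypothetical:

```sql
-- EXPLAIN shows the plan the optimizer picks; EXPLAIN ANALYZE also executes the
-- query and reports actual timings and row counts.
EXPLAIN ANALYZE
SELECT id, status
FROM letters
WHERE account_id = 42
  AND created_at >= now() - interval '30 days';

-- In the output, "Index Scan using letters_account_id_created_at_idx" means an index
-- is being used; "Seq Scan on letters" means the planner is reading the whole table,
-- which on a large table is often the query worth revisiting first.
```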

We encouraged the team to capture, share, and review GitHub gists or repos as applicable. We also share links to similar or tangentially related content we discover. (Are you really a developer if you haven’t combed Stack Overflow, documentation, GitHub issues, and so forth looking for more answers or examples?)

Our efforts to level up our database knowledge had an immediate impact, but before getting to examples, it helps to see what else we covered in our subsequent sessions around Postgres. We spent the next couple of meetups covering other areas of Postgres, such as data types and internals; we’ve included the specific reading recommendations in case you want to dive in too.

In addition to making us feel like geniuses, the lessons learned from absorbing and discussing this content as a team were the driving force behind several Postgres database optimizations.

For example, we initially assumed that limiting the range of data a customer could export from our (user-facing) dashboard would improve performance, but in testing we discovered this was not true. After our first session, we were much more comfortable with the details of Postgres indexing (and how it applied to our specific data and requirements). In our case, removing the range constraints allowed the query to use the correct index; on top of that, by adding an inflection to the query we were able to ensure it always used the right index. This empowered customers to take better control of their data exports (also freeing up CX time and resources) and resulted in a faster, more responsive API and dashboard. These updates have yielded, on average, 15x better performance.
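As a hypothetical sketch (our actual schema and queries aren’t shown here), that kind of before/after could look something like this:

```sql
-- Before: a forced date-range filter can steer the planner toward an index on
-- (created_at) alone, reading and discarding many rows.
EXPLAIN ANALYZE
SELECT *
FROM letters
WHERE account_id = 42
  AND created_at BETWEEN '2021-01-01' AND '2021-03-31'
ORDER BY created_at DESC;

-- After: without the range constraint, a composite index such as
-- (account_id, created_at) matches the access pattern, and the planner can use it
-- to satisfy both the filter and the ORDER BY.
EXPLAIN ANALYZE
SELECT *
FROM letters
WHERE account_id = 42
ORDER BY created_at DESC;
```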
