ruby-batch-loader-1.4.1+dfsg.1/.gitignore

/.bundle/
/.yardoc
/Gemfile.lock
/_yardoc/
/coverage/
/doc/
/pkg/
/spec/reports/
/tmp/

# rspec failure tracking
.rspec_status
.ruby-version

ruby-batch-loader-1.4.1+dfsg.1/.rspec

--format documentation
--color

ruby-batch-loader-1.4.1+dfsg.1/.travis.yml

sudo: false
language: ruby
before_install: gem install bundler -v 1.17.1
matrix:
  include:
    - gemfile: graphql-1.7.gemfile
      env: GRAPHQL_RUBY_VERSION=1_7 CI=true
      rvm: 2.3.8
    - gemfile: graphql-1.8.gemfile
      env: GRAPHQL_RUBY_VERSION=1_8 CI=true
      rvm: 2.3.8
    - gemfile: graphql-1.7.gemfile
      env: GRAPHQL_RUBY_VERSION=1_7 CI=true
      rvm: 2.4.5
    - gemfile: graphql-1.8.gemfile
      env: GRAPHQL_RUBY_VERSION=1_8 CI=true
      rvm: 2.4.5
    - gemfile: graphql-1.7.gemfile
      env: GRAPHQL_RUBY_VERSION=1_7 CI=true
      rvm: 2.5.3
    - gemfile: graphql-1.8.gemfile
      env: GRAPHQL_RUBY_VERSION=1_8 CI=true
      rvm: 2.5.3

ruby-batch-loader-1.4.1+dfsg.1/CHANGELOG.md

# Changelog

The following are lists of the notable changes included with each release. This is intended to help keep people informed about notable changes between versions, as well as provide a rough history. Each item is prefixed with one of the following labels: `Added`, `Changed`, `Deprecated`, `Removed`, `Fixed`, `Security`. We also use [Semantic Versioning](http://semver.org) to manage the versions of this gem so that you can set version constraints properly.
#### [Unreleased](https://github.com/exAspArk/batch-loader/compare/v1.4.1...HEAD)

* WIP

#### [v1.4.1](https://github.com/exAspArk/batch-loader/compare/v1.4.0...v1.4.1)

* `Fixed`: Does not allow mutating and corrupting a list of items in a `batch` block. [#46](https://github.com/exAspArk/batch-loader/pull/46)

#### [v1.4.0](https://github.com/exAspArk/batch-loader/compare/v1.3.0...v1.4.0)

* `Added`: new `replace_methods` argument to `BatchLoader#batch` to allow control over `define_method` calls. [#45](https://github.com/exAspArk/batch-loader/pull/45)

#### [v1.3.0](https://github.com/exAspArk/batch-loader/compare/v1.2.2...v1.3.0)

* `Added`: `BatchLoader::GraphQL.for` to make it compatible with `graphql` gem versions `>= 1.8.7`. [#30](https://github.com/exAspArk/batch-loader/issues/30)

#### [v1.2.2](https://github.com/exAspArk/batch-loader/compare/v1.2.1...v1.2.2)

* `Fixed`: Identify item by `key` object instead of `key` string representation. [#27](https://github.com/exAspArk/batch-loader/pull/27)

#### [v1.2.1](https://github.com/exAspArk/batch-loader/compare/v1.2.0...v1.2.1)

* `Fixed`: Do not depend on `method_missing` for `respond_to?`. [#14](https://github.com/exAspArk/batch-loader/pull/14)

#### [v1.2.0](https://github.com/exAspArk/batch-loader/compare/v1.1.1...v1.2.0)

* `Added`: `key` argument for the `BatchLoader#batch` method. [#12](https://github.com/exAspArk/batch-loader/pull/12)

#### [v1.1.1](https://github.com/exAspArk/batch-loader/compare/v1.1.0...v1.1.1)

* `Fixed`: `loader`, made it thread-safe again. [#10](https://github.com/exAspArk/batch-loader/pull/10)

#### [v1.1.0](https://github.com/exAspArk/batch-loader/compare/v1.0.4...v1.1.0)

* `Added`: `default_value` override option. [#8](https://github.com/exAspArk/batch-loader/pull/8)
* `Added`: `loader.call {}` block syntax, for memoizing repeat calls to the same item. [#8](https://github.com/exAspArk/batch-loader/pull/8)

#### [v1.0.4](https://github.com/exAspArk/batch-loader/compare/v1.0.3...v1.0.4)

* `Fixed`: Fix arity bug in `respond_to?`. [#3](https://github.com/exAspArk/batch-loader/pull/3)

#### [v1.0.3](https://github.com/exAspArk/batch-loader/compare/v1.0.2...v1.0.3) – 2017-09-18

* `Fixed`: auto syncing performance, up to 30x compared to [v1.0.2](https://github.com/exAspArk/batch-loader/blob/master/CHANGELOG.md#v102--2017-09-14). Ruby `Forwardable` with `def_delegators` is too slow.
* `Fixed`: GraphQL performance, up to 3x, by disabling auto syncing in favor of syncing with [graphql-ruby](https://github.com/rmosolgo/graphql-ruby) `lazy_resolve`.
* `Added`: more benchmarks.

#### [v1.0.2](https://github.com/exAspArk/batch-loader/compare/v1.0.1...v1.0.2) – 2017-09-14

* `Added`: `BatchLoader#inspect` method because of Pry, which [swallows errors](https://github.com/pry/pry/issues/1642).

```ruby
# Before:
require 'pry'
binding.pry

pry(main)> result = BatchLoader.for(1).batch { |ids, loader| raise "Oops" };
pry(main)> result # Pry called result.inspect and swallowed the "Oops" error
# => #
pry(main)> result.id
# => NoMethodError: undefined method `id' for nil:NilClass
```

```ruby
# After:
require 'pry'
binding.pry

pry(main)> result = BatchLoader.for(1).batch { |ids, loader| raise "Oops" };
pry(main)> result
# => #
pry(main)> result.id
# => RuntimeError: Oops
```

* `Added`: benchmarks.
* `Fixed`: caching `nil`s for not loaded values only after successful `#batch` execution.
* `Changed`: internal implementation with Ruby `Forwardable`, don't delegate methods like `object_id` and `__send__`.

#### [v1.0.1](https://github.com/exAspArk/batch-loader/compare/v1.0.0...v1.0.1) – 2017-09-03

* `Fixed`: loading `BatchLoader` by requiring `Set`.

#### [v1.0.0](https://github.com/exAspArk/batch-loader/compare/v0.3.0...v1.0.0) – 2017-08-21

* `Removed`: `BatchLoader.sync!` and `BatchLoader#sync`. Now syncing is done implicitly when you call any method on the lazy object.

```ruby
def load_user(user_id)
  BatchLoader.for(user_id).batch { ... }
end

# Before:
users = [load_user(1), load_user(2), load_user(3)]

puts BatchLoader.sync!(users) # or users.map!(&:sync)
```

```ruby
# After:
users = [load_user(1), load_user(2), load_user(3)]

puts users
```

* `Removed`: `BatchLoader#load`. Use the `loader` lambda instead:

```ruby
# Before:
BatchLoader.for(user_id).batch do |user_ids, batch_loader|
  user_ids.each { |user_id| batch_loader.load(user_id, user_id) }
end
```

```ruby
# After:
BatchLoader.for(user_id).batch do |user_ids, loader|
  user_ids.each { |user_id| loader.call(user_id, user_id) }
end
```

* `Changed`: use `BatchLoader::GraphQL` in GraphQL schema:

```ruby
# Before:
Schema = GraphQL::Schema.define do
  # ...
  lazy_resolve BatchLoader, :sync
end
```

```ruby
# After:
Schema = GraphQL::Schema.define do
  # ...
  use BatchLoader::GraphQL
end
```

#### [v0.3.0](https://github.com/exAspArk/batch-loader/compare/v0.2.0...v0.3.0) – 2017-08-03

* `Added`: `BatchLoader::Executor.clear_current` to clear cache manually.
* `Added`: tests and description how to use with GraphQL.

#### [v0.2.0](https://github.com/exAspArk/batch-loader/compare/v0.1.0...v0.2.0) – 2017-08-02

* `Added`: `cache: false` option to disable caching for resolved values.
* `Added`: `BatchLoader::Middleware` to clear cache between Rack requests.
* `Added`: more docs and tests.

#### [v0.1.0](https://github.com/exAspArk/batch-loader/compare/ed32edb...v0.1.0) – 2017-07-31

* `Added`: initial functional version.
ruby-batch-loader-1.4.1+dfsg.1/CODE_OF_CONDUCT.md

# Contributor Covenant Code of Conduct

## Our Pledge

In the interest of fostering an open and welcoming environment, we as contributors and maintainers pledge to making participation in our project and our community a harassment-free experience for everyone, regardless of age, body size, disability, ethnicity, gender identity and expression, level of experience, nationality, personal appearance, race, religion, or sexual identity and orientation.

## Our Standards

Examples of behavior that contributes to creating a positive environment include:

* Using welcoming and inclusive language
* Being respectful of differing viewpoints and experiences
* Gracefully accepting constructive criticism
* Focusing on what is best for the community
* Showing empathy towards other community members

Examples of unacceptable behavior by participants include:

* The use of sexualized language or imagery and unwelcome sexual attention or advances
* Trolling, insulting/derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or electronic address, without explicit permission
* Other conduct which could reasonably be considered inappropriate in a professional setting

## Our Responsibilities

Project maintainers are responsible for clarifying the standards of acceptable behavior and are expected to take appropriate and fair corrective action in response to any instances of unacceptable behavior.

Project maintainers have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, or to ban temporarily or permanently any contributor for other behaviors that they deem inappropriate, threatening, offensive, or harmful.

## Scope

This Code of Conduct applies both within project spaces and in public spaces when an individual is representing the project or its community. Examples of representing a project or community include using an official project e-mail address, posting via an official social media account, or acting as an appointed representative at an online or offline event. Representation of a project may be further defined and clarified by project maintainers.

## Enforcement

Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting the project team at exaspark@gmail.com. All complaints will be reviewed and investigated and will result in a response that is deemed necessary and appropriate to the circumstances. The project team is obligated to maintain confidentiality with regard to the reporter of an incident. Further details of specific enforcement policies may be posted separately.

Project maintainers who do not follow or enforce the Code of Conduct in good faith may face temporary or permanent repercussions as determined by other members of the project's leadership.

## Attribution

This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4, available at [http://contributor-covenant.org/version/1/4][version]

[homepage]: http://contributor-covenant.org
[version]: http://contributor-covenant.org/version/1/4/

ruby-batch-loader-1.4.1+dfsg.1/Gemfile

source "https://rubygems.org"

gem 'coveralls', require: false

# Specify your gem's dependencies in batch-loader.gemspec
gemspec

ruby-batch-loader-1.4.1+dfsg.1/LICENSE.txt

The MIT License (MIT)

Copyright (c) 2017 exAspArk

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
ruby-batch-loader-1.4.1+dfsg.1/README.md

# BatchLoader

[![Build Status](https://travis-ci.org/exAspArk/batch-loader.svg?branch=master)](https://travis-ci.org/exAspArk/batch-loader)
[![Coverage Status](https://coveralls.io/repos/github/exAspArk/batch-loader/badge.svg)](https://coveralls.io/github/exAspArk/batch-loader)
[![Code Climate](https://img.shields.io/codeclimate/maintainability/exAspArk/batch-loader.svg)](https://codeclimate.com/github/exAspArk/batch-loader/maintainability)
[![Downloads](https://img.shields.io/gem/dt/batch-loader.svg)](https://rubygems.org/gems/batch-loader)
[![Latest Version](https://img.shields.io/gem/v/batch-loader.svg)](https://rubygems.org/gems/batch-loader)

This gem provides a generic lazy batching mechanism to avoid N+1 DB queries, HTTP queries, etc.

Developers from these companies use `BatchLoader`: GitLab, Netflix, Alibaba, Universe, Wealthsimple, Decidim.

## Contents

* [Highlights](#highlights)
* [Usage](#usage)
  * [Why?](#why)
  * [Basic example](#basic-example)
  * [How it works](#how-it-works)
  * [RESTful API example](#restful-api-example)
  * [GraphQL example](#graphql-example)
  * [Loading multiple items](#loading-multiple-items)
  * [Batch key](#batch-key)
  * [Caching](#caching)
  * [Replacing methods](#replacing-methods)
* [Installation](#installation)
* [API](#api)
* [Implementation details](#implementation-details)
* [Development](#development)
* [Related gems](#related-gems)
* [Contributing](#contributing)
* [Alternatives](#alternatives)
* [License](#license)
* [Code of Conduct](#code-of-conduct)

## Highlights

* Generic utility to avoid N+1 DB queries, HTTP requests, etc.
* Adapted Ruby implementation of battle-tested tools like [Haskell Haxl](https://github.com/facebook/Haxl), [JS DataLoader](https://github.com/facebook/dataloader), etc.
* Batching is isolated and lazy, load data in batch where and when it's needed.
* Automatically caches previous queries (identity map).
* Thread-safe (`loader`).
* No need to share batching through variables or custom defined classes.
* No dependencies, no monkey-patches, no extra primitives such as Promises.

## Usage

### Why?

Let's have a look at the code with N+1 queries:

```ruby
def load_posts(ids)
  Post.where(id: ids)
end

posts = load_posts([1, 2, 3])  # Posts      SELECT * FROM posts WHERE id IN (1, 2, 3)
                               #  _ ↓ _
                               # ↙  ↓  ↘
users = posts.map do |post|    # U  ↓  ↓   SELECT * FROM users WHERE id = 1
  post.user                    # ↓  U  ↓   SELECT * FROM users WHERE id = 2
end                            # ↓  ↓  U   SELECT * FROM users WHERE id = 3
                               #  ↘ ↓ ↙
                               #  ¯ ↓ ¯
puts users                     # Users
```

The naive approach would be to preload dependent objects on the top level:

```ruby
# With ORM in basic cases
def load_posts(ids)
  Post.where(id: ids).includes(:user)
end

# But without ORM or in more complicated cases you will have to do something like:
def load_posts(ids)
  # load posts
  posts = Post.where(id: ids)
  user_ids = posts.map(&:user_id)

  # load users
  users = User.where(id: user_ids)
  user_by_id = users.each_with_object({}) { |user, memo| memo[user.id] = user }

  # map user to post
  posts.each { |post| post.user = user_by_id[post.user_id] }
end

posts = load_posts([1, 2, 3])  # Posts      SELECT * FROM posts WHERE id IN (1, 2, 3)
                               #  _ ↓ _     SELECT * FROM users WHERE id IN (1, 2, 3)
                               # ↙  ↓  ↘
users = posts.map do |post|    # U  ↓  ↓
  post.user                    # ↓  U  ↓
end                            # ↓  ↓  U
                               #  ↘ ↓ ↙
                               #  ¯ ↓ ¯
puts users                     # Users
```

But the problem here is that `load_posts` now depends on the child association and knows that it has to preload data for future use. And it'll do it every time, even if it's not necessary. Can we do better? Sure!
### Basic example

With `BatchLoader` we can rewrite the code above:

```ruby
def load_posts(ids)
  Post.where(id: ids)
end

def load_user(post)
  BatchLoader.for(post.user_id).batch do |user_ids, loader|
    User.where(id: user_ids).each { |user| loader.call(user.id, user) }
  end
end

posts = load_posts([1, 2, 3])  # Posts      SELECT * FROM posts WHERE id IN (1, 2, 3)
                               #  _ ↓ _
                               # ↙  ↓  ↘
users = posts.map do |post|    # BL ↓  ↓
  load_user(post)              # ↓  BL ↓
end                            # ↓  ↓  BL
                               #  ↘ ↓ ↙
                               #  ¯ ↓ ¯
puts users                     # Users      SELECT * FROM users WHERE id IN (1, 2, 3)
```

As we can see, batching is isolated and described right in a place where it's needed.

### How it works

In general, `BatchLoader` returns a lazy object. Each lazy object knows which data it needs to load and how to batch the query. As soon as you need to use the lazy objects, they will be automatically loaded once without N+1 queries.

So, when we call `BatchLoader.for` we pass an item (`user_id`) which should be collected and used for batching later. For the `batch` method, we pass a block which will use all the collected items (`user_ids`):
```ruby
BatchLoader.for(post.user_id).batch do |user_ids, loader|
  ...
end
```
Inside the block we execute a batch query for our items (`User.where`). After that, all we have to do is to call `loader` by passing an item which was used in `BatchLoader.for` method (`user_id`) and the loaded object itself (`user`):
```ruby
BatchLoader.for(post.user_id).batch do |user_ids, loader|
  User.where(id: user_ids).each { |user| loader.call(user.id, user) }
end
```
When we call any method on the lazy object, it'll be automatically loaded through batching for all instantiated `BatchLoader`s:
```ruby
puts users # => SELECT * FROM users WHERE id IN (1, 2, 3)
```
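The collect-then-batch flow described above can be sketched without the gem itself. The following toy `LazyBatch` class is made up for illustration (it is not part of batch-loader's API): keys are collected eagerly, but the batch block runs only once, on the first value access, and its results are memoized:

```ruby
# A minimal sketch of lazy batching -- NOT batch-loader's real implementation.
class LazyBatch
  def self.reset!
    @keys = []      # items collected so far
    @results = nil  # memoized key => value map, filled by the batch block
  end

  def self.for(key, &block)
    @keys << key
    @block = block
    # Return a lazy handle; the batch runs on first #call, then is memoized.
    -> { (@results ||= @block.call(@keys)).fetch(key) }
  end
end

LazyBatch.reset!
queries = 0
load_users = lambda do |ids|
  queries += 1                             # count "queries" to show batching
  ids.map { |id| [id, "user-#{id}"] }.to_h # one batched lookup for all ids
end

lazies = [1, 2, 3].map { |id| LazyBatch.for(id, &load_users) }
users  = lazies.map(&:call) # first access triggers a single batch
# users == ["user-1", "user-2", "user-3"]; queries == 1
```

Three lazy handles are created, yet `queries` ends up at 1: all keys were collected before any value was needed, so one batch served them all. The real gem layers thread-safety, caching, and method delegation on top of this idea.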
For more information, see the [Implementation details](#implementation-details) section.

### RESTful API example

Now imagine we have a regular Rails app with N+1 HTTP requests:

```ruby
# app/models/post.rb
class Post < ApplicationRecord
  def rating
    HttpClient.request(:get, "https://example.com/ratings/#{id}")
  end
end

# app/controllers/posts_controller.rb
class PostsController < ApplicationController
  def index
    posts = Post.limit(10)
    serialized_posts = posts.map { |post| {id: post.id, rating: post.rating} } # N+1 HTTP requests for each post.rating

    render json: serialized_posts
  end
end
```

As we can see, the code above will make N+1 HTTP requests, one for each post. Let's batch the requests with a gem called [parallel](https://github.com/grosser/parallel):

```ruby
class Post < ApplicationRecord
  def rating_lazy
    BatchLoader.for(post).batch do |posts, loader|
      Parallel.each(posts, in_threads: 10) { |post| loader.call(post, post.rating) }
    end
  end

  # ...
end
```

`loader` is thread-safe. So, if `HttpClient` is also thread-safe, then with the `parallel` gem we can execute all HTTP requests concurrently in threads (there are some benchmarks for [concurrent HTTP requests](https://github.com/exAspArk/concurrent_http_requests) in Ruby). Thanks to Matz, MRI releases the GIL when a thread hits blocking I/O – an HTTP request in our case.

In the controller, all we have to do is to replace `post.rating` with the lazy `post.rating_lazy`:

```ruby
class PostsController < ApplicationController
  def index
    posts = Post.limit(10)
    serialized_posts = posts.map { |post| {id: post.id, rating: post.rating_lazy} }

    render json: serialized_posts
  end
end
```

`BatchLoader` caches the loaded values. To ensure that the cache is purged between requests in the app, add the following middleware to your `config/application.rb`:

```ruby
config.middleware.use BatchLoader::Middleware
```

See the [Caching](#caching) section for more information.

### GraphQL example

Batching is particularly useful with GraphQL.
Using such techniques as preloading data in advance to avoid N+1 queries can be very complicated, since a user can ask for any available fields in a query.

Let's take a look at the simple [graphql-ruby](https://github.com/rmosolgo/graphql-ruby) schema example:

```ruby
Schema = GraphQL::Schema.define do
  query QueryType
end

QueryType = GraphQL::ObjectType.define do
  name "Query"
  field :posts, !types[PostType], resolve: ->(obj, args, ctx) { Post.all }
end

PostType = GraphQL::ObjectType.define do
  name "Post"
  field :user, !UserType, resolve: ->(post, args, ctx) { post.user } # N+1 queries
end

UserType = GraphQL::ObjectType.define do
  name "User"
  field :name, !types.String
end
```

If we want to execute a simple query like the following, we will get N+1 queries for each `post.user`:

```ruby
query = "
{
  posts {
    user {
      name
    }
  }
}
"
Schema.execute(query)
```

To avoid this problem, all we have to do is to change the resolver to return `BatchLoader::GraphQL` ([#32](https://github.com/exAspArk/batch-loader/pull/32) explains why not just `BatchLoader`):

```ruby
PostType = GraphQL::ObjectType.define do
  name "Post"
  field :user, !UserType, resolve: ->(post, args, ctx) do
    BatchLoader::GraphQL.for(post.user_id).batch do |user_ids, loader|
      User.where(id: user_ids).each { |user| loader.call(user.id, user) }
    end
  end
end
```

And setup GraphQL to use the built-in `lazy_resolve` method:

```ruby
Schema = GraphQL::Schema.define do
  query QueryType
  use BatchLoader::GraphQL
end
```

That's it.

### Loading multiple items

For batches where there is no item in response to a call, we normally return `nil`.
However, you can use `:default_value` to return something else instead:

```ruby
BatchLoader.for(post.user_id).batch(default_value: NullUser.new) do |user_ids, loader|
  User.where(id: user_ids).each { |user| loader.call(user.id, user) }
end
```

For batches where the value is some kind of collection, such as an Array or Hash, `loader` also supports being called with a block, which yields the _current_ value and returns the _next_ value. This is extremely useful for 1:Many relationships:

```ruby
BatchLoader.for(user.id).batch(default_value: []) do |user_ids, loader|
  Comment.where(user_id: user_ids).each do |comment|
    loader.call(comment.user_id) { |memo| memo << comment }
  end
end
```

### Batch key

It's possible to reuse the same `BatchLoader#batch` block for loading different types of data by specifying a unique `key`. For example, with polymorphic associations:

```ruby
def lazy_association(post)
  id = post.association_id
  key = post.association_type

  BatchLoader.for(id).batch(key: key) do |ids, loader, args|
    model = Object.const_get(args[:key])
    model.where(id: ids).each { |record| loader.call(record.id, record) }
  end
end

post1 = Post.save(association_id: 1, association_type: 'Tag')
post2 = Post.save(association_id: 1, association_type: 'Category')

lazy_association(post1) # SELECT * FROM tags WHERE id IN (1)
lazy_association(post2) # SELECT * FROM categories WHERE id IN (1)
```

It's also required to pass a custom `key` when using `BatchLoader` with metaprogramming (e.g. `eval`).

### Caching

By default `BatchLoader` caches the loaded values. You can test it by running something like:

```ruby
def user_lazy(id)
  BatchLoader.for(id).batch do |ids, loader|
    User.where(id: ids).each { |user| loader.call(user.id, user) }
  end
end

puts user_lazy(1) # SELECT * FROM users WHERE id IN (1)
# => <#User:...>

puts user_lazy(1) # no request
# => <#User:...>
```

Usually, it's just enough to clear the cache between HTTP requests in the app.
To do so, simply add the middleware:

```ruby
use BatchLoader::Middleware
```

To drop the cache manually you can run:

```ruby
puts user_lazy(1) # SELECT * FROM users WHERE id IN (1)
puts user_lazy(1) # no request

BatchLoader::Executor.clear_current

puts user_lazy(1) # SELECT * FROM users WHERE id IN (1)
```

In some rare cases it's useful to disable caching for `BatchLoader`. For example, in tests or after data mutations:

```ruby
def user_lazy(id)
  BatchLoader.for(id).batch(cache: false) do |ids, loader|
    # ...
  end
end

puts user_lazy(1) # SELECT * FROM users WHERE id IN (1)
puts user_lazy(1) # SELECT * FROM users WHERE id IN (1)
```

If you set `cache: false`, it's likely you also want `replace_methods: false` (see the section below).

### Replacing methods

By default, `BatchLoader` replaces methods on its instance by calling `#define_method` after batching to copy methods from the loaded value. This consumes some time up front but speeds up any future method calls on the instance.

In some cases, when there are a lot of instances with a huge number of defined methods, this initial process of replacing the methods can be slow. You may consider avoiding the "up front payment" and "paying as you go" with `#method_missing` by disabling the method replacement:

```ruby
BatchLoader.for(id).batch(replace_methods: false) do |ids, loader|
  # ...
end
```

## Installation

Add this line to your application's Gemfile:

```ruby
gem 'batch-loader'
```

And then execute:

    $ bundle

Or install it yourself as:

    $ gem install batch-loader

## API

```ruby
BatchLoader.for(item).batch(
  default_value: default_value,
  cache: cache,
  replace_methods: replace_methods,
  key: key
) do |items, loader, args|
  # ...
end
```

| Argument Key      | Default                                                              | Description                                                                           |
| ----------------- | -------------------------------------------------------------------- | ------------------------------------------------------------------------------------- |
| `item`            | -                                                                    | Item which will be collected and used for batching.                                   |
| `default_value`   | `nil`                                                                | Value returned by default after batching.                                             |
| `cache`           | `true`                                                               | Set `false` to disable caching between the same executions.                           |
| `replace_methods` | `true`                                                               | Set `false` to use `#method_missing` instead of replacing the methods after batching. |
| `key`             | `nil`                                                                | Pass custom key to uniquely identify the batch block.                                 |
| `items`           | -                                                                    | List of collected items for batching.                                                 |
| `loader`          | -                                                                    | Lambda which should be called to load values loaded in batch.                         |
| `args`            | `{default_value: nil, cache: true, replace_methods: true, key: nil}` | Arguments passed to the `batch` method.                                               |

## Implementation details

See the [slides](https://speakerdeck.com/exaspark/batching-a-powerful-way-to-solve-n-plus-1-queries) [37-42].

## Development

After checking out the repo, run `bin/setup` to install dependencies. Then, run `rake spec` to run the tests. You can also run `bin/console` for an interactive prompt that will allow you to experiment.

To install this gem onto your local machine, run `bundle exec rake install`. To release a new version, update the version number in `version.rb`, and then run `bundle exec rake release`, which will create a git tag for the version, push git commits and tags, and push the `.gem` file to [rubygems.org](https://rubygems.org).

## Related gems

These gems are built by using `BatchLoader`:

* [decidim-core](https://github.com/decidim/decidim/) – participatory democracy framework made with Ruby on Rails.
* [ams_lazy_relationships](https://github.com/Bajena/ams_lazy_relationships/) – ActiveModel Serializers add-on for eliminating N+1 queries.
* [batch-loader-active-record](https://github.com/mathieul/batch-loader-active-record/) – ActiveRecord lazy association generator to avoid N+1 DB queries.

## Contributing

Bug reports and pull requests are welcome on GitHub at https://github.com/exAspArk/batch-loader.
This project is intended to be a safe, welcoming space for collaboration, and contributors are expected to adhere to the [Contributor Covenant](http://contributor-covenant.org) code of conduct.

## Alternatives

There are some other Ruby implementations for batching such as:

* [shopify/graphql-batch](https://github.com/shopify/graphql-batch)
* [sheerun/dataloader](https://github.com/sheerun/dataloader)

However, `batch-loader` has some differences:

* It is implemented for general usage and can be used not only with GraphQL. In fact, we use it for RESTful APIs and GraphQL on production at the same time.
* It doesn't try to mimic implementations in other programming languages which have an asynchronous nature. So, it doesn't load extra dependencies to bring such primitives as Promises, which are not very popular in the Ruby community. Instead, it uses the idea of lazy objects, which are included in the [Ruby standard library](https://ruby-doc.org/core-2.4.1/Enumerable.html#method-i-lazy). These lazy objects allow one to return the necessary data at the end when it's necessary.
* It doesn't force you to share batching through variables or custom defined classes, just pass a block to the `batch` method.
* It doesn't require you to return an array of the loaded objects in the same order as the passed items. These constraints are difficult to satisfy: sorting the loaded objects and adding `nil` values for the missing ones. Instead, it provides the `loader` lambda, which simply maps an item to the loaded object.
* It doesn't depend on any other external dependencies. For example, there is no need to load huge external libraries for thread-safety; the gem is thread-safe out of the box.

## License

The gem is available as open source under the terms of the [MIT License](http://opensource.org/licenses/MIT).
## Code of Conduct Everyone interacting in the Batch::Loader project’s codebases, issue trackers, chat rooms and mailing lists is expected to follow the [code of conduct](https://github.com/exAspArk/batch-loader/blob/master/CODE_OF_CONDUCT.md). ruby-batch-loader-1.4.1+dfsg.1/Rakefile000066400000000000000000000001651351114611600175450ustar00rootroot00000000000000require "bundler/gem_tasks" require "rspec/core/rake_task" RSpec::Core::RakeTask.new(:spec) task :default => :spec ruby-batch-loader-1.4.1+dfsg.1/batch-loader.gemspec000066400000000000000000000022521351114611600217710ustar00rootroot00000000000000# coding: utf-8 require_relative "./lib/batch_loader/version" Gem::Specification.new do |spec| spec.name = "batch-loader" spec.version = BatchLoader::VERSION spec.authors = ["exAspArk"] spec.email = ["exaspark@gmail.com"] spec.summary = %q{Powerful tool to avoid N+1 DB or HTTP queries} spec.description = %q{Powerful tool to avoid N+1 DB or HTTP queries} spec.homepage = "https://github.com/exAspArk/batch-loader" spec.license = "MIT" spec.files = `git ls-files -z`.split("\x0").reject do |f| f.match(%r{^(spec|images)/}) end spec.bindir = "exe" spec.executables = spec.files.grep(%r{^exe/}) { |f| File.basename(f) } spec.require_paths = ["lib"] spec.required_ruby_version = '>= 2.1.0' # keyword args spec.add_development_dependency "bundler", "~> 2.0" spec.add_development_dependency "rake", "~> 10.0" spec.add_development_dependency "rspec", "~> 3.0" spec.add_development_dependency "graphql", "~> 1.6" spec.add_development_dependency "pry-byebug", "~> 3.4" spec.add_development_dependency "benchmark-ips", "~> 2.7" spec.add_development_dependency "ruby-prof", "~> 0.16" end ruby-batch-loader-1.4.1+dfsg.1/bin/000077500000000000000000000000001351114611600166465ustar00rootroot00000000000000ruby-batch-loader-1.4.1+dfsg.1/bin/console000077500000000000000000000005331351114611600202370ustar00rootroot00000000000000#!/usr/bin/env ruby require "bundler/setup" require "batch_loader" # You 
can add fixtures and/or initialization code here to make experimenting # with your gem easier. You can also use a different console, if you like. # (If you use this, don't forget to add pry to your Gemfile!) # require "pry" # Pry.start require "irb" IRB.start(__FILE__) ruby-batch-loader-1.4.1+dfsg.1/bin/setup000077500000000000000000000002031351114611600177270ustar00rootroot00000000000000#!/usr/bin/env bash set -euo pipefail IFS=$'\n\t' set -vx bundle install # Do any other automated setup that you need to do here ruby-batch-loader-1.4.1+dfsg.1/graphql-1.7.gemfile000066400000000000000000000001211351114611600213630ustar00rootroot00000000000000source "https://rubygems.org" gem 'coveralls' gem "graphql", "~> 1.7" gemspec ruby-batch-loader-1.4.1+dfsg.1/graphql-1.8.gemfile000066400000000000000000000001211351114611600213640ustar00rootroot00000000000000source "https://rubygems.org" gem 'coveralls' gem "graphql", "~> 1.8" gemspec ruby-batch-loader-1.4.1+dfsg.1/lib/000077500000000000000000000000001351114611600166445ustar00rootroot00000000000000ruby-batch-loader-1.4.1+dfsg.1/lib/batch-loader.rb000066400000000000000000000000421351114611600215120ustar00rootroot00000000000000require_relative "./batch_loader" ruby-batch-loader-1.4.1+dfsg.1/lib/batch_loader.rb000066400000000000000000000071651351114611600216110ustar00rootroot00000000000000# frozen_string_literal: true require "set" require_relative "./batch_loader/version" require_relative "./batch_loader/executor_proxy" require_relative "./batch_loader/middleware" require_relative "./batch_loader/graphql" class BatchLoader IMPLEMENTED_INSTANCE_METHODS = %i[object_id __id__ __send__ singleton_method_added __sync respond_to? 
batch inspect].freeze REPLACABLE_INSTANCE_METHODS = %i[batch inspect].freeze LEFT_INSTANCE_METHODS = (IMPLEMENTED_INSTANCE_METHODS - REPLACABLE_INSTANCE_METHODS).freeze NoBatchError = Class.new(StandardError) def self.for(item) new(item: item) end def initialize(item:, executor_proxy: nil) @item = item @__executor_proxy = executor_proxy end def batch(default_value: nil, cache: true, replace_methods: nil, key: nil, &batch_block) @default_value = default_value @cache = cache @replace_methods = replace_methods.nil? ? cache : replace_methods @key = key @batch_block = batch_block __executor_proxy.add(item: @item) __singleton_class.class_eval { undef_method(:batch) } self end def respond_to?(method_name, include_private = false) return true if LEFT_INSTANCE_METHODS.include?(method_name) __loaded_value.respond_to?(method_name, include_private) end def inspect "#<BatchLoader:0x#{(object_id << 1).to_s(16)}>" end def __sync return @loaded_value if @synced __ensure_batched @loaded_value = __executor_proxy.loaded_value(item: @item) if @cache @synced = true else __purge_cache end @loaded_value end private def __loaded_value result = __sync! @cache ? @loaded_value : result end def method_missing(method_name, *args, &block) __sync!.public_send(method_name, *args, &block) end def __sync!
loaded_value = __sync if @replace_methods __replace_with!(loaded_value) else loaded_value end end def __ensure_batched return if __executor_proxy.value_loaded?(item: @item) items = __executor_proxy.list_items loader = __loader args = {default_value: @default_value, cache: @cache, replace_methods: @replace_methods, key: @key} @batch_block.call(items, loader, args) items.each do |item| next if __executor_proxy.value_loaded?(item: item) loader.call(item, @default_value) end __executor_proxy.delete(items: items) end def __loader mutex = Mutex.new -> (item, value = (no_value = true; nil), &block) do if no_value && !block raise ArgumentError, "Please pass a value or a block" elsif block && !no_value raise ArgumentError, "Please pass a value or a block, not both" end mutex.synchronize do next_value = block ? block.call(__executor_proxy.loaded_value(item: item)) : value __executor_proxy.load(item: item, value: next_value) end end end def __singleton_class class << self ; self ; end end def __replace_with!(value) __singleton_class.class_eval do (value.methods - LEFT_INSTANCE_METHODS).each do |method_name| define_method(method_name) do |*args, &block| value.public_send(method_name, *args, &block) end end end self end def __purge_cache __executor_proxy.unload_value(item: @item) __executor_proxy.add(item: @item) end def __executor_proxy @__executor_proxy ||= begin raise NoBatchError.new("Please provide a batch block first") unless @batch_block BatchLoader::ExecutorProxy.new(@default_value, @key, &@batch_block) end end (instance_methods - IMPLEMENTED_INSTANCE_METHODS).each { |method_name| undef_method(method_name) } end ruby-batch-loader-1.4.1+dfsg.1/lib/batch_loader/000077500000000000000000000000001351114611600212535ustar00rootroot00000000000000ruby-batch-loader-1.4.1+dfsg.1/lib/batch_loader/executor.rb000066400000000000000000000010421351114611600234330ustar00rootroot00000000000000# frozen_string_literal: true class BatchLoader class Executor NAMESPACE = :batch_loader def 
self.ensure_current Thread.current[NAMESPACE] ||= new end def self.current Thread.current[NAMESPACE] end def self.clear_current Thread.current[NAMESPACE] = nil end attr_reader :items_by_block, :loaded_values_by_block def initialize @items_by_block = Hash.new { |hash, key| hash[key] = Set.new } @loaded_values_by_block = Hash.new { |hash, key| hash[key] = {} } end end end ruby-batch-loader-1.4.1+dfsg.1/lib/batch_loader/executor_proxy.rb000066400000000000000000000021621351114611600247000ustar00rootroot00000000000000# frozen_string_literal: true require_relative "./executor" class BatchLoader class ExecutorProxy attr_reader :default_value, :block, :global_executor def initialize(default_value, key, &block) @default_value = default_value @block = block @block_hash_key = [block.source_location, key] @global_executor = BatchLoader::Executor.ensure_current end def add(item:) items_to_load << item end def list_items items_to_load.to_a.freeze end def delete(items:) global_executor.items_by_block[@block_hash_key] = items_to_load - items end def load(item:, value:) loaded[item] = value end def loaded_value(item:) if value_loaded?(item: item) loaded[item] else @default_value.dup end end def value_loaded?(item:) loaded.key?(item) end def unload_value(item:) loaded.delete(item) end private def items_to_load global_executor.items_by_block[@block_hash_key] end def loaded global_executor.loaded_values_by_block[@block_hash_key] end end end ruby-batch-loader-1.4.1+dfsg.1/lib/batch_loader/graphql.rb000066400000000000000000000021641351114611600232410ustar00rootroot00000000000000# frozen_string_literal: true class BatchLoader class GraphQL def self.use(schema_definition) schema_definition.lazy_resolve(BatchLoader::GraphQL, :sync) # for graphql gem versions <= 1.8.6 which work with BatchLoader instead of BatchLoader::GraphQL schema_definition.instrument(:field, self) end def self.instrument(type, field) old_resolve_proc = field.resolve_proc new_resolve_proc = ->(object, arguments, 
context) do result = old_resolve_proc.call(object, arguments, context) result.respond_to?(:__sync) ? BatchLoader::GraphQL.wrap(result) : result end field.redefine { resolve(new_resolve_proc) } end def self.wrap(batch_loader) BatchLoader::GraphQL.new.tap do |graphql| graphql.batch_loader = batch_loader end end def self.for(item) new(item) end attr_writer :batch_loader def initialize(item = nil) @batch_loader = BatchLoader.for(item) end def batch(*args, &block) @batch_loader.batch(*args, &block) self end def sync @batch_loader.__sync end end end ruby-batch-loader-1.4.1+dfsg.1/lib/batch_loader/middleware.rb000066400000000000000000000004001351114611600237070ustar00rootroot00000000000000# frozen_string_literal: true class BatchLoader class Middleware def initialize(app) @app = app end def call(env) begin @app.call(env) ensure BatchLoader::Executor.clear_current end end end end ruby-batch-loader-1.4.1+dfsg.1/lib/batch_loader/version.rb000066400000000000000000000001111351114611600232560ustar00rootroot00000000000000# frozen_string_literal: true class BatchLoader VERSION = "1.4.1" end ruby-batch-loader-1.4.1+dfsg.1/spec/000077500000000000000000000000001351114611600170305ustar00rootroot00000000000000ruby-batch-loader-1.4.1+dfsg.1/spec/batch_loader/000077500000000000000000000000001351114611600214375ustar00rootroot00000000000000ruby-batch-loader-1.4.1+dfsg.1/spec/batch_loader/middleware_spec.rb000066400000000000000000000010521351114611600251110ustar00rootroot00000000000000require "spec_helper" RSpec.describe BatchLoader::Middleware do describe '#call' do it 'returns the result from the app' do app = ->(_env) { 1 } middleware = BatchLoader::Middleware.new(app) expect(middleware.call(nil)).to eq(1) end it 'clears the Executor' do app = ->(_) { nil } middleware = BatchLoader::Middleware.new(app) BatchLoader::Executor.ensure_current expect { middleware.call(nil) }.to change { BatchLoader::Executor.current }.to(nil) end end end 
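middleware_spec.rb above verifies that `BatchLoader::Middleware` returns the app's result and clears the thread-local executor after each request. The same ensure-clear pattern can be sketched standalone; `FakeExecutor` and `EnsureClearMiddleware` below are hypothetical stand-ins for illustration, not the gem's own classes:

```ruby
# Minimal sketch of the ensure-clear middleware pattern shown above.
# FakeExecutor mimics BatchLoader::Executor's thread-local storage.
class FakeExecutor
  NAMESPACE = :fake_batch_loader

  def self.ensure_current
    Thread.current[NAMESPACE] ||= new
  end

  def self.current
    Thread.current[NAMESPACE]
  end

  def self.clear_current
    Thread.current[NAMESPACE] = nil
  end
end

class EnsureClearMiddleware
  def initialize(app)
    @app = app
  end

  def call(env)
    @app.call(env)
  ensure
    # Runs whether the app succeeds or raises, so no state leaks
    # into the next request handled by this thread.
    FakeExecutor.clear_current
  end
end

# A fake Rack-style app that populates the thread-local executor.
app = ->(_env) { FakeExecutor.ensure_current; [200, {}, ["ok"]] }
status, _headers, _body = EnsureClearMiddleware.new(app).call({})
# status is 200 and FakeExecutor.current is nil afterwards
```

The `ensure` clause is the important part: the per-request cache is dropped even when the wrapped app raises, which is exactly why the real middleware wraps `@app.call(env)` the same way.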
ruby-batch-loader-1.4.1+dfsg.1/spec/batch_loader_spec.rb000066400000000000000000000235411351114611600230030ustar00rootroot00000000000000require "spec_helper" RSpec.describe BatchLoader do after do User.destroy_all Post.destroy_all end context 'lazily' do it "syncs all BatchLoaders by returning the loaded value" do user1 = User.save(id: 1) post1 = Post.new(user_id: user1.id) user2 = User.save(id: 2) post2 = Post.new(user_id: user2.id) result = {user1: post1.user_lazy, user2: post2.user_lazy} expect(User).to receive(:where).with(id: [1, 2]).once.and_call_original expect(result).to eq(user1: user1, user2: user2) end it 'raises an error if batch was not provided' do expect { BatchLoader.for(1).id }.to raise_error(BatchLoader::NoBatchError, "Please provide a batch block first") end it 'caches the result even between different BatchLoader instances' do user = User.save(id: 1) post = Post.new(user_id: user.id) expect(User).to receive(:where).with(id: [1]).once.and_call_original expect(post.user_lazy.id).to eq(user.id) expect(post.user_lazy.id).to eq(user.id) end it 'caches the result for the same BatchLoader instance' do user = User.save(id: 1) post = Post.new(user_id: user.id) expect(User).to receive(:where).with(id: [1]).once.and_call_original expect(post.user_lazy).to eq(user) expect(post.user_lazy).to eq(user) end it 'works even if the loaded value is nil' do post = Post.new(user_id: 1) expect(User).to receive(:where).with(id: [1]).once.and_call_original expect(post.user_lazy).to eq(nil) expect(post.user_lazy).to eq(nil) end it 'raises an error if the loaded value does not have a method' do user = User.save(id: 1) post = Post.new(user_id: user.id) expect(User).to receive(:where).with(id: [1]).once.and_call_original expect { post.user_lazy.foo }.to raise_error(NoMethodError, /undefined method `foo' for #<User/) end it 'works with nested batch loaders' do user1 = User.save(id: 1) user2 = User.save(id: 2) nested_batch_loader = ->(id) do BatchLoader.for(id).batch do |user_ids, loader| User.where(id: user_ids).each { |u| loader.call(u.id, u.id) } end end batch_loader = ->(id) do
BatchLoader.for(id).batch do |user_ids, loader| user_ids.each { |user_id| loader.call(user_id, nested_batch_loader.call(user_id)) } end end expect(User).to receive(:where).with(id: [1, 2]).once.and_call_original result = [batch_loader.call(1), batch_loader.call(2)] expect(result).to eq([1, 2]) end end context 'with custom key' do it 'batches multiple items by key' do author = Author.save(id: 1) reader = Reader.save(id: 2) batch_loader = ->(type, id) do BatchLoader.for(id).batch(key: type) do |ids, loader, args| args[:key].where(id: ids).each { |user| loader.call(user.id, user) } end end loader_author = batch_loader.call(Author, 1) loader_reader = batch_loader.call(Reader, 2) expect(Author).to receive(:where).with(id: [1]).once.and_call_original expect(loader_author).to eq(author) expect(Reader).to receive(:where).with(id: [2]).once.and_call_original expect(loader_reader).to eq(reader) end it 'batches multiple items with hash-identical keys' do user = Author.new(id: 1) same_user = Reader.new(id: 1) other_user = Reader.new(id: 2) post_1 = Post.save(user_id: 1, title: "First post") post_2 = Post.save(user_id: 1, title: "Second post") post_3 = Post.save(user_id: 2, title: "First post") batch_loader = ->(user, title) do BatchLoader.for(title).batch(key: user) do |titles, loader, args| args[:key].posts.select { |p| titles.include?(p.title) }.each { |post| loader.call(post.title, post) } end end loader_1 = batch_loader.call(user, "First post") loader_2 = batch_loader.call(same_user, "Second post") loader_3 = batch_loader.call(other_user, "First post") expect(user).to receive(:posts).once.and_call_original expect(same_user).not_to receive(:posts) expect(other_user).to receive(:posts).once.and_call_original expect(loader_1).to eq(post_1) expect(loader_2).to eq(post_2) expect(loader_3).to eq(post_3) end end context 'loader' do it 'loads the data even in a separate thread' do lazy = BatchLoader.for(1).batch do |nums, loader| threads = nums.map do |num| Thread.new { 
loader.call(num, num + 1) } end threads.each(&:join) end expect(lazy).to eq(2) end it 'is thread-safe' do batch_block = Proc.new do |ids, loader| ids.each do |id| thread = Thread.new { loader.call(id) { |value| value << id } } loader.call(id) { |value| value << id + 1 } thread.join end end slow_executor_proxy = SlowExecutorProxy.new([], nil, &batch_block) lazy = BatchLoader.new(item: 1, executor_proxy: slow_executor_proxy).batch(default_value: [], &batch_block) expect(lazy).to match_array([1, 2]) end it 'supports alternative default values' do lazy = BatchLoader.for(1).batch(default_value: 123) do |nums, loader| # No-op, so default is returned end expect(lazy).to eq(123) end it 'supports memoizing repeated calls to the same item, via a block' do lazy = BatchLoader.for(1).batch(default_value: []) do |nums, loader| nums.each do |num| loader.call(num) { |memo| memo.push(num) } loader.call(num) { |memo| memo.push(num + 1) } loader.call(num) { |memo| memo.push(num + 2) } end end expect(lazy).to eq([1,2,3]) end it 'raises ArgumentError if called with block and value' do lazy = BatchLoader.for(1).batch do |nums, loader| nums.each do |num| loader.call(num, "one value") { "too many values" } end end expect { lazy.sync }.to raise_error(ArgumentError) end it 'raises ArgumentError if called without block and value' do lazy = BatchLoader.for(1).batch do |nums, loader| nums.each { |num| loader.call(num) } end expect { lazy.sync }.to raise_error(ArgumentError) end end describe '#inspect' do it 'returns BatchLoader without syncing and delegates #inspect after' do user = User.save(id: 1) post = Post.new(user_id: user.id) batch_loader = post.user_lazy expect(batch_loader.inspect).to match(/#<BatchLoader:0x\w+>/) expect(batch_loader.to_s).to match(/#<User:0x\w+>/) expect(batch_loader.inspect).to match(/#<User:0x\w+>/) end end describe '#respond_to?'
do let(:user) { User.save(id: 1) } let(:post) { Post.new(user_id: user.id) } subject { post.user_lazy } it 'syncs the object just once' do loaded_user = post.user_lazy expect(loaded_user.respond_to?(:id)).to eq(true) end it 'returns false for private methods by default' do expect(subject.respond_to?(:some_private_method)).to eq(false) end it 'returns true for private methods if include_private flag is true' do expect(subject.respond_to?(:some_private_method, true)).to eq(true) end it 'does not depend on the loaded value #method_missing' do expect(user).not_to receive(:method_missing) expect(subject).to respond_to(:id) end context 'when the cache and method replacement is disabled' do it 'syncs the object on every call' do loaded_user = post.user_lazy(cache: false, replace_methods: false) expect(User).to receive(:where).with(id: [1]).twice.and_call_original loaded_user.respond_to?(:id) loaded_user.respond_to?(:id) end end end describe '#batch' do it 'delegates the second batch call to the loaded value' do user = User.save(id: 1) post = Post.new(user_id: user.id) expect(post.user_lazy.batch).to eq("Batch from User") end it 'works without cache between different BatchLoader instances for the same item' do user1 = User.save(id: 1) user2 = User.save(id: 2) post = Post.new(user_id: user1.id) expect(User).to receive(:where).with(id: [1]).once.and_call_original expect(post.user_lazy).to eq(user1) post.user_id = user2.id expect(User).to receive(:where).with(id: [2]).once.and_call_original expect(post.user_lazy(cache: false)).to eq(user2) end it 'works without cache and method replacement for the same BatchLoader instance' do user = User.save(id: 1) post = Post.new(user_id: user.id) user_lazy = post.user_lazy(cache: false, replace_methods: false) expect(User).to receive(:where).with(id: [1]).twice.and_call_original expect(user_lazy).to eq(user) expect(user_lazy).to eq(user) end it 'does not replace methods when replace_methods is false' do user = User.save(id: 1) post = 
Post.new(user_id: user.id) user_lazy = post.user_lazy(cache: true, replace_methods: false) expect(user_lazy).to receive(:method_missing).and_call_original user_lazy.id end it 'does not allow mutating a list of items' do batch_loader = BatchLoader.for(1).batch do |items, loader| items.map! { |i| i - 1 } end expect { batch_loader.to_s }.to raise_error(RuntimeError, "can't modify frozen Array") end it 'raises the error if something went wrong in the batch' do result = BatchLoader.for(1).batch { |ids, loader| raise "Oops" } # should work even with Pry which currently swallows errors on #inspect call https://github.com/pry/pry/issues/1642 # require 'pry'; binding.pry expect { result.to_s }.to raise_error("Oops") expect { result.to_s }.to raise_error("Oops") end end end ruby-batch-loader-1.4.1+dfsg.1/spec/benchmarks/000077500000000000000000000000001351114611600211455ustar00rootroot00000000000000ruby-batch-loader-1.4.1+dfsg.1/spec/benchmarks/batching.rb000066400000000000000000000035421351114611600232550ustar00rootroot00000000000000# frozen_string_literal: true # Usage: ruby spec/benchmarks/batching.rb require 'benchmark/ips' require_relative "../../lib/batch_loader" require_relative "../fixtures/models" User.save(id: 1) def batch_loader BatchLoader.for(1).batch do |ids, loader| User.where(id: ids).each { |user| loader.call(user.id, user) } end end batch_loader_with_cache = batch_loader batch_loader_without_cache = BatchLoader.for(1).batch(cache: false) do |ids, loader| User.where(id: ids).each { |user| loader.call(user.id, user) } end Benchmark.ips do |x| x.config(time: 5, warmup: 0) x.report("pure") { User.where(id: [1]).first.id } x.report("already synced") { batch_loader_with_cache.id } x.report("with cache") { batch_loader.id } x.report("with purged cache") { batch_loader.id ; BatchLoader::Executor.clear_current } x.report("without cache") { batch_loader_without_cache.id } x.compare!
end # Warming up -------------------------------------- # pure 1.000 i/100ms # already synced 1.000 i/100ms # with cache 1.000 i/100ms # with purged cache 1.000 i/100ms # without cache 1.000 i/100ms # Calculating ------------------------------------- # pure 960.344k (±16.1%) i/s - 3.283M # already synced 989.078k (± 9.0%) i/s - 3.200M # with cache 8.400k (±20.1%) i/s - 39.442k in 4.982755s # with purged cache 7.218k (±19.0%) i/s - 34.015k in 4.981920s # without cache 76.683k (±19.5%) i/s - 345.987k in 4.874809s # Comparison: # already synced: 989077.6 i/s # pure: 960343.8 i/s - same-ish: difference falls within error # without cache: 76682.6 i/s - 12.90x slower # with cache: 8399.6 i/s - 117.75x slower # with purged cache: 7218.5 i/s - 137.02x slower ruby-batch-loader-1.4.1+dfsg.1/spec/benchmarks/caching.rb000066400000000000000000000051501351114611600230670ustar00rootroot00000000000000# Usage: ruby spec/benchmarks/caching.rb # no replacement + many methods _can_ be faster than replacement + many # methods, but this depends on the values of METHOD_COUNT, OBJECT_COUNT, # and CALL_COUNT, so tweak them for your own scenario! 
require 'benchmark' require_relative '../../lib/batch_loader' METHOD_COUNT = 1000 # methods on the object with a large interface OBJECT_COUNT = 1000 # objects in the batch CALL_COUNT = 1000 # times a method is called on the loaded object class ManyMethods 1.upto(METHOD_COUNT) do |i| define_method("method_#{i}") { i } end end class FewMethods def method_1 1 end end def load_value(x, **opts) BatchLoader.for(x).batch(opts) do |xs, loader| xs.each { |x| loader.call(x, x) } end end def benchmark(klass:, **opts) OBJECT_COUNT.times do value = load_value(klass.new, opts) CALL_COUNT.times { value.method_1 } end end Benchmark.bmbm do |x| x.report('replacement + many methods') { benchmark(klass: ManyMethods) } x.report('replacement + few methods') { benchmark(klass: FewMethods) } x.report('no replacement + many methods') { benchmark(klass: ManyMethods, replace_methods: false) } x.report('no replacement + few methods') { benchmark(klass: FewMethods, replace_methods: false) } x.report('no cache + many methods') { benchmark(klass: ManyMethods, cache: false, replace_methods: false) } x.report('no cache + few methods') { benchmark(klass: FewMethods, cache: false, replace_methods: false) } end # Rehearsal ----------------------------------------------------------------- # replacement + many methods 2.260000 0.030000 2.290000 ( 2.603038) # replacement + few methods 0.450000 0.000000 0.450000 ( 0.457151) # no replacement + many methods 0.440000 0.010000 0.450000 ( 0.454444) # no replacement + few methods 0.370000 0.000000 0.370000 ( 0.380699) # no cache + many methods 31.780000 0.240000 32.020000 ( 33.552620) # no cache + few methods 31.510000 0.200000 31.710000 ( 32.294752) # ------------------------------------------------------- total: 67.290000sec # user system total real # replacement + many methods 2.330000 0.010000 2.340000 ( 2.382599) # replacement + few methods 0.430000 0.000000 0.430000 ( 0.438584) # no replacement + many methods 0.420000 0.000000 0.420000 ( 0.434069) # no 
replacement + few methods 0.440000 0.010000 0.450000 ( 0.452091) # no cache + many methods 31.630000 0.160000 31.790000 ( 32.337531) # no cache + few methods 36.590000 0.370000 36.960000 ( 40.701712) ruby-batch-loader-1.4.1+dfsg.1/spec/benchmarks/graphql.rb000066400000000000000000000012611351114611600231300ustar00rootroot00000000000000# frozen_string_literal: true # Usage: ruby spec/benchmarks/graphql.rb && open tmp/stack.html require 'ruby-prof' require "graphql" require_relative "../../lib/batch_loader" require_relative "../fixtures/models" require_relative "../fixtures/graphql_schema" iterations = Array.new(2_000) iterations.each_with_index do |_, i| user = User.save(id: i) Post.save(user_id: user.id) end query = "{ posts { user { id } } }" RubyProf.measure_mode = RubyProf::WALL_TIME RubyProf.start GraphqlSchema.execute(query) # 0.45, 0.52, 0.47 sec result = RubyProf.stop stack_printer = RubyProf::CallStackPrinter.new(result) File.open("tmp/stack.html", 'w') { |file| stack_printer.print(file) } ruby-batch-loader-1.4.1+dfsg.1/spec/benchmarks/profiling.rb000066400000000000000000000012171351114611600234640ustar00rootroot00000000000000# frozen_string_literal: true # Usage: ruby spec/benchmarks/profiling.rb && open tmp/stack.html require 'ruby-prof' require_relative "../../lib/batch_loader" require_relative "../fixtures/models" User.save(id: 1) iterations = Array.new(5_000) def batch_loader BatchLoader.for(1).batch do |ids, loader| User.where(id: ids).each { |user| loader.call(user.id, user) } end end RubyProf.measure_mode = RubyProf::WALL_TIME RubyProf.start iterations.each { batch_loader.id } # 2.46, 2.87, 2.56 sec result = RubyProf.stop stack_printer = RubyProf::CallStackPrinter.new(result) File.open("tmp/stack.html", 'w') { |file| stack_printer.print(file) } 
ruby-batch-loader-1.4.1+dfsg.1/spec/fixtures/000077500000000000000000000000001351114611600207015ustar00rootroot00000000000000ruby-batch-loader-1.4.1+dfsg.1/spec/fixtures/graphql_schema.rb000066400000000000000000000032651351114611600242120ustar00rootroot00000000000000case ENV['GRAPHQL_RUBY_VERSION'] when '1_7' UserType = GraphQL::ObjectType.define do name "User" field :id, !types.ID end PostType = GraphQL::ObjectType.define do name "Post" field :user, !UserType, resolve: ->(object, args, ctx) do BatchLoader::GraphQL.for(object.user_id).batch do |user_ids, loader| User.where(id: user_ids).each { |user| loader.call(user.id, user) } end end field :userOld, !UserType, resolve: ->(object, args, ctx) do BatchLoader.for(object.user_id).batch do |user_ids, loader| User.where(id: user_ids).each { |user| loader.call(user.id, user) } end end end QueryType = GraphQL::ObjectType.define do name "Query" field :posts, !types[PostType], resolve: ->(obj, args, ctx) { Post.all } end GraphqlSchema = GraphQL::Schema.define do query QueryType use BatchLoader::GraphQL end when '1_8' class UserType < GraphQL::Schema::Object field :id, ID, null: false end class PostType < GraphQL::Schema::Object field :user, UserType, null: false field :user_old, UserType, null: false def user BatchLoader::GraphQL.for(object.user_id).batch do |user_ids, loader| User.where(id: user_ids).each { |user| loader.call(user.id, user) } end end def user_old BatchLoader.for(object.user_id).batch do |user_ids, loader| User.where(id: user_ids).each { |user| loader.call(user.id, user) } end end end class QueryType < GraphQL::Schema::Object field :posts, [PostType], null: false def posts Post.all end end class GraphqlSchema < GraphQL::Schema query QueryType use BatchLoader::GraphQL end end ruby-batch-loader-1.4.1+dfsg.1/spec/fixtures/models.rb000066400000000000000000000026751351114611600225230ustar00rootroot00000000000000class Post attr_accessor :user_id, :title class << self def save(user_id:, title: nil) 
ensure_init_store new(user_id: user_id, title: title).tap { |post| @posts << post } end def all ensure_init_store @posts end def destroy_all @posts = [] end private def ensure_init_store @posts ||= [] end end def initialize(user_id:, title: nil) self.user_id = user_id self.title = title || "Untitled" end def user_lazy(**opts) BatchLoader.for(user_id).batch(**opts) do |user_ids, loader| User.where(id: user_ids).each { |user| loader.call(user.id, user) } end end end class User class << self def save(id:) ensure_init_store @store[self][id] = new(id: id) end def where(id:) ensure_init_store @store[self].each_with_object([]) { |(k, v), memo| memo << v if id.include?(k) } end def destroy_all ensure_init_store @store[self] = {} end private def ensure_init_store @store ||= Hash.new { |h, k| h[k] = {} } end end attr_reader :id def initialize(id:) @id = id end def batch "Batch from User" end def hash [User, id].hash end def posts Post.all.select { |p| p.user_id == id } end def eql?(other) other.is_a?(User) && id == other.id end private def some_private_method end end class Author < User end class Reader < User end ruby-batch-loader-1.4.1+dfsg.1/spec/graphql_spec.rb000066400000000000000000000012561351114611600220310ustar00rootroot00000000000000require "spec_helper" RSpec.describe 'GraphQL integration' do it 'resolves BatchLoader fields lazily' do user1 = User.save(id: "1") user2 = User.save(id: "2") Post.save(user_id: user1.id) Post.save(user_id: user2.id) query = <<~QUERY { posts { user { id } userOld { id } } } QUERY expect(User).to receive(:where).with(id: ["1", "2"]).twice.and_call_original result = GraphqlSchema.execute(query) expect(result['data']).to eq({ 'posts' => [ {'user' => {'id' => "1"}, 'userOld' => {'id' => "1"}}, {'user' => {'id' => "2"}, 'userOld' => {'id' => "2"}} ] }) end end ruby-batch-loader-1.4.1+dfsg.1/spec/spec_helper.rb000066400000000000000000000014651351114611600216540ustar00rootroot00000000000000require "bundler/setup" ENV['GRAPHQL_RUBY_VERSION'] 
||= '1_8' if ENV['CI'] require 'coveralls' Coveralls.wear! end require_relative "../lib/batch_loader" require "graphql" require_relative "./fixtures/models" require_relative "./fixtures/graphql_schema" class SlowExecutorProxy < BatchLoader::ExecutorProxy def value_loaded?(item:) result = loaded.key?(item) sleep 0.5 result end end RSpec.configure do |config| # Enable flags like --only-failures and --next-failure config.example_status_persistence_file_path = ".rspec_status" # Disable RSpec exposing methods globally on `Module` and `main` config.disable_monkey_patching! config.order = :random config.expect_with :rspec do |c| c.syntax = :expect end config.after do BatchLoader::Executor.clear_current end end
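Stepping back from the individual files, the core technique the gem implements is: hand out lazy placeholders that each register a key, then resolve all registered keys with one batched lookup on first access. `TinyLazy` below is a hypothetical, self-contained illustration of that idea (the fake `batch_load` stands in for one DB/HTTP round trip); it is not the gem's actual API:

```ruby
# Self-contained sketch (not the gem itself) of lazy batching:
# each instance registers its key, and the first #value call
# resolves every registered key with a single batched lookup.
class TinyLazy
  def initialize(key, registry)
    @key = key
    @registry = registry
    @registry[:keys] << key
  end

  def value
    # Resolve all collected keys at once, exactly once.
    @registry[:loaded] ||= batch_load(@registry[:keys])
    @registry[:loaded][@key]
  end

  private

  # Stand-in for one DB/HTTP round trip resolving all keys together.
  def batch_load(keys)
    keys.each_with_object({}) { |k, memo| memo[k] = k * 10 }
  end
end

registry = { keys: [], loaded: nil }
lazies = (1..3).map { |id| TinyLazy.new(id, registry) }
values = lazies.map(&:value)
# values == [10, 20, 30], resolved via a single batch_load call
```

This is the same shape as `BatchLoader.for(id).batch { |ids, loader| ... }` in the specs above: `registry[:keys]` plays the role of the executor's items-to-load set, and `batch_load` plays the role of the user-supplied batch block.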