This article is one of Metal Toad's Top 20 Drupal Tips. Enjoy!

For a recent project, we were tasked to consume the client's internal data from a custom API. For the sake of this blog, the API will be referred to as the Iguana API, the products will be Tea, and the client, the Iguana Tea Company.

The premise was that the data was to be downloaded at regular intervals, so that content editors didn't need to copy and paste to keep product information up to date. Updates to their external dataset would be done at a steady pace, a handful of times a week, and would include a low number of changes, around 50 to 100 items per change. This scenario was lucky for us: the API provides a total item count of about 5,000, but when queried with a start date, it provides all revisions of items between then and now.

Since Drupal is known to be cumbersome with content saving, the idea of pulling in and saving data for 5,000 nodes in one go during cron didn't seem feasible. So I premised that we could do a full import via an administration page, and cron could keep the data up to date utilizing the cron queue, with help from Ultimate Cron.

So, first I stubbed out administrative pages consisting of an overview page, an import form, and a settings form. The overview will be used to display the current health of the API. The import form will trigger a batch job to import and save all of the API's Tea data. The settings form is for configuring access to the API and such.

To support comparing each download against the last, the module's .install file defines a table whose description says it all: 'Preserves the raw data downloaded from the Iguana API for comparison.' In outline, the schema looks something like this (field names beyond data are illustrative):

```php
/**
 * Implements hook_schema().
 */
function iguana_schema() {
  // Field names here are illustrative; the table stores one row of raw
  // API output per Tea item so later downloads can be compared against it.
  $schema['iguana_tea_raw'] = array(
    'description' => 'Preserves the raw data downloaded from the Iguana API for comparison.',
    'fields' => array(
      'id' => array(
        'description' => 'The Tea ID from the Iguana API.',
        'type' => 'int',
        'unsigned' => TRUE,
        'not null' => TRUE,
        'default' => 0,
      ),
      'data' => array(
        'description' => 'The raw data for this Tea item, as downloaded.',
        'type' => 'blob',
        'size' => 'big',
      ),
    ),
    'primary key' => array('id'),
  );
  return $schema;
}
```

Now, to connect to the Iguana API, I needed to know the credentials to access it, so that meant building out the configuration form.
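Before getting to that form: in Drupal 7, the three administrative pages above would typically hang off hook_menu(). Here is a rough sketch of how the routes could be registered; the paths, titles, permission, and callback names are assumptions for illustration, not the project's actual code:

```php
/**
 * Implements hook_menu().
 *
 * Sketch only: paths, titles, and callback names are assumed.
 */
function iguana_menu() {
  $items['admin/config/services/iguana'] = array(
    'title' => 'Iguana API',
    'description' => 'Health overview, import, and settings for the Iguana API.',
    'page callback' => 'iguana_overview_page',
    'access arguments' => array('administer site configuration'),
  );
  $items['admin/config/services/iguana/overview'] = array(
    'title' => 'Overview',
    'type' => MENU_DEFAULT_LOCAL_TASK,
  );
  $items['admin/config/services/iguana/import'] = array(
    'title' => 'Import',
    'page callback' => 'drupal_get_form',
    'page arguments' => array('iguana_import_form'),
    'access arguments' => array('administer site configuration'),
    'type' => MENU_LOCAL_TASK,
  );
  $items['admin/config/services/iguana/settings'] = array(
    'title' => 'Settings',
    'page callback' => 'drupal_get_form',
    'page arguments' => array('iguana_settings_form'),
    'access arguments' => array('administer site configuration'),
    'type' => MENU_LOCAL_TASK,
  );
  return $items;
}
```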
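The configuration form itself is a natural fit for Drupal 7's system_settings_form(), which appends the submit button and persists each element as a Drupal variable. A minimal sketch, assuming the credentials are a base URL and an API key; the variable names are mine, not the article's:

```php
/**
 * Form callback for the Iguana API settings form.
 *
 * A minimal sketch: the variable names (iguana_api_url, iguana_api_key)
 * are illustrative assumptions.
 */
function iguana_settings_form($form, &$form_state) {
  $form['iguana_api_url'] = array(
    '#type' => 'textfield',
    '#title' => t('API base URL'),
    '#default_value' => variable_get('iguana_api_url', ''),
    '#required' => TRUE,
  );
  $form['iguana_api_key'] = array(
    '#type' => 'textfield',
    '#title' => t('API key'),
    '#default_value' => variable_get('iguana_api_key', ''),
    '#required' => TRUE,
  );
  // system_settings_form() adds the submit button and, on submit, saves
  // each element's value to a variable keyed by the element name.
  return system_settings_form($form);
}
```

One nice side effect of storing credentials in variables is that they can be overridden per environment via $conf in settings.php, so production and staging can point at different API accounts.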
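As for the cron side mentioned earlier, Drupal 7's queue API pairs naturally with Ultimate Cron: hook_cron() can enqueue each changed Tea item, and a worker declared in hook_cron_queue_info() saves them a few at a time instead of all 5,000 at once. A sketch, with assumed queue and callback names; iguana_fetch_updates() is a hypothetical helper standing in for the actual API call:

```php
/**
 * Implements hook_cron().
 *
 * Sketch: iguana_fetch_updates() is a hypothetical helper that queries
 * the Iguana API with the last-run date and returns the changed items
 * (the API returns all revisions since that date).
 */
function iguana_cron() {
  $since = variable_get('iguana_last_import', 0);
  $queue = DrupalQueue::get('iguana_tea_import');
  foreach (iguana_fetch_updates($since) as $item) {
    $queue->createItem($item);
  }
  variable_set('iguana_last_import', REQUEST_TIME);
}

/**
 * Implements hook_cron_queue_info().
 */
function iguana_cron_queue_info() {
  $queues['iguana_tea_import'] = array(
    'worker callback' => 'iguana_tea_queue_worker',
    // Spend at most 60 seconds per cron run working this queue.
    'time' => 60,
  );
  return $queues;
}

/**
 * Queue worker: save a single Tea item.
 */
function iguana_tea_queue_worker($item) {
  // Node creation/update for one Tea item would go here; one item per
  // queue entry keeps each unit of work small, so cron stays cheap.
}
```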