public inbox for openembedded-core@lists.openembedded.org
* [PATCH 00/15] Developer workflow tools
@ 2014-12-19 11:41 Paul Eggleton
  2014-12-19 11:41 ` [PATCH 01/15] meta-environment: don't mark tasks as nostamp Paul Eggleton
                   ` (15 more replies)
  0 siblings, 16 replies; 17+ messages in thread
From: Paul Eggleton @ 2014-12-19 11:41 UTC (permalink / raw)
  To: openembedded-core

I've been looking at how to make it easier for application and system
component developers to get their work done using the tools we provide,
and I believe this patchset is a piece of the solution. There are still
a number of other pieces to come, but this should be usable on its own.

The first part of this patchset makes some changes to OE to accommodate
the tools. One of these is to extend the PATCHTOOL = "git" code to
ensure that we're able to apply all patches even if "git am" and
"git apply" can't handle them, that we do a commit to the repository per
patch, and that there is a way to track these commits back to the patch
from which they were added. There are also some additions to shared
library code that may well be useful for other tools.

I've then added a new recipe auto-creation script, recipetool, which can
take a source tree or URL and create a skeleton recipe to build it.
(In case anyone is wondering about the existing scripts/create-recipe,
frankly I consider it a dead end - it's written in Perl, which makes it
a bit difficult to integrate with the rest of our code; it's also
GPLv3-only which makes any such integration pretty much impossible from
another angle.)

Then we add devtool. This allows you to:

 * Add a new piece of software (auto-create the skeleton of a recipe
   using the aforementioned recipetool and point the build to an
   external source tree)
 * Modify the source for an existing recipe (point the build to an
   external source tree, possibly creating that tree in the same step
   and managing it with git)
 * Deploy the installed files from a recipe from ${D} to a target using
   scp.
 * Apply any modifications to the commits in the external source tree
   back to the recipe (by updating/adding/removing patches and updating
   SRC_URI within the recipe).

There will obviously be extensions to these tools, but I hope they are
already functional enough to be useful in their current state.
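To make the workflow concrete, a typical session with these tools might
look something like the following (command names are taken from the patch
subjects and descriptions in this series; exact syntax and options may
differ from what finally lands):

```shell
# Create a skeleton recipe from an external source tree with recipetool
recipetool create -o meta-custom/recipes-example/foo/foo_1.0.bb /path/to/foo

# Point the build for an existing recipe at an external source tree,
# extracting and git-managing that tree in the same step
devtool modify -x foo /path/to/foo-src

# Deploy the installed files (from ${D}) to a running target over scp
devtool deploy foo root@192.168.7.2

# Write the commits in the external source tree back to the recipe
# as patches, updating SRC_URI as needed
devtool update-recipe foo
```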

(Note that this series depends on another series of patches to BitBake 
sent at the same time - these can be found on the paule/devtool-bb2
branch on poky-contrib.)


Changes since the earlier RFC:

For those who looked at the earlier RFC series, there have been quite a
few improvements since then - almost too many to list. Beyond what's
visible in the actual commits, the highlights are:

* recipetool now uses autoscan to pick up dependencies from Makefile-only
  software
* recipetool is much smarter about library dependencies (making use of
  collected shared lib provider and pkgdata information rather than a
  limited hardcoded map) and detects more licenses.
* devtool now creates the workspace layer as needed if it doesn't exist
* devtool can now write back commits to patches next to the original
  recipe
* Compilation of recipes in the workspace now happens every time rather
  than just once, and causes dependent tasks (e.g. images) to be rebuilt
  as well.
* recipetool: if we can't find any license files at all, set LICENSE to
  "CLOSED" (with a comment warning it needs to be set more appropriately)
  so that the user can at least start building the recipe.
* devtool can extract the source for linux-yocto, but externalsrc usage
  may still be problematic - something that needs to be addressed.
* Branch and tag the external source git repository
* Improved error reporting, help and general messages

(Thanks to Trevor Woerner and others for earlier testing & feedback.)


The following changes since commit 8d0e56a850579f9a6d501266deeef9b257ce4780:

  serf: readded md5sum (2014-12-11 11:34:22 +0000)

are available in the git repository at:

  git://git.openembedded.org/openembedded-core-contrib paule/devtool
  http://cgit.openembedded.org/cgit.cgi/openembedded-core-contrib/log/?h=paule/devtool

Junchun Guan (1):
  scripts/devtool: Support deploy/undeploy function

Paul Eggleton (14):
  meta-environment: don't mark tasks as nostamp
  classes/package: move read_shlib_providers() to a common unit
  lib/oe/patch: fall back to patch if git apply fails
  lib/oe/patch: auto-commit when falling back from git am
  lib/oe/patch: use --keep-cr with git am
  lib/oe/patch.py: abort "git am" if it fails
  lib/oe/patch: add support for extracting patches from git tree
  lib/oe: add recipeutils module
  oeqa/utils: make get_bb_var() more reliable
  classes/externalsrc: set do_compile as nostamp
  scripts/recipetool: Add a recipe auto-creation script
  scripts: add scriptutils module
  scripts/devtool: add development helper tool
  devtool: add QA tests

 meta/classes/externalsrc.bbclass           |   3 +
 meta/classes/package.bbclass               |  24 +-
 meta/lib/oe/package.py                     |  26 ++
 meta/lib/oe/patch.py                       | 158 ++++++++-
 meta/lib/oe/recipeutils.py                 | 279 +++++++++++++++
 meta/lib/oeqa/selftest/devtool.py          | 239 +++++++++++++
 meta/lib/oeqa/utils/commands.py            |   4 +-
 meta/recipes-core/meta/meta-environment.bb |   2 -
 scripts/devtool                            | 255 ++++++++++++++
 scripts/lib/devtool/__init__.py            |  78 +++++
 scripts/lib/devtool/deploy.py              | 100 ++++++
 scripts/lib/devtool/standard.py            | 545 +++++++++++++++++++++++++++++
 scripts/lib/recipetool/__init__.py         |   0
 scripts/lib/recipetool/create.py           | 413 ++++++++++++++++++++++
 scripts/lib/recipetool/create_buildsys.py  | 319 +++++++++++++++++
 scripts/lib/scriptutils.py                 |  60 ++++
 scripts/recipetool                         |  99 ++++++
 17 files changed, 2572 insertions(+), 32 deletions(-)
 create mode 100644 meta/lib/oe/recipeutils.py
 create mode 100644 meta/lib/oeqa/selftest/devtool.py
 create mode 100755 scripts/devtool
 create mode 100644 scripts/lib/devtool/__init__.py
 create mode 100644 scripts/lib/devtool/deploy.py
 create mode 100644 scripts/lib/devtool/standard.py
 create mode 100644 scripts/lib/recipetool/__init__.py
 create mode 100644 scripts/lib/recipetool/create.py
 create mode 100644 scripts/lib/recipetool/create_buildsys.py
 create mode 100644 scripts/lib/scriptutils.py
 create mode 100755 scripts/recipetool

-- 
1.9.3



^ permalink raw reply	[flat|nested] 17+ messages in thread

* [PATCH 01/15] meta-environment: don't mark tasks as nostamp
  2014-12-19 11:41 [PATCH 00/15] Developer workflow tools Paul Eggleton
@ 2014-12-19 11:41 ` Paul Eggleton
  2014-12-19 11:41 ` [PATCH 02/15] classes/package: move read_shlib_providers() to a common unit Paul Eggleton
                   ` (14 subsequent siblings)
  15 siblings, 0 replies; 17+ messages in thread
From: Paul Eggleton @ 2014-12-19 11:41 UTC (permalink / raw)
  To: openembedded-core

With siggen being changed to alter the signature of nostamp tasks on the
fly, having these tasks marked as nostamp results in the SDK being
rebuilt every time, which is not desirable. In any case, this is just a
legacy of the days before we used signatures to ensure these tasks get
re-run when they need to be.

Signed-off-by: Paul Eggleton <paul.eggleton@linux.intel.com>
---
 meta/recipes-core/meta/meta-environment.bb | 2 --
 1 file changed, 2 deletions(-)

diff --git a/meta/recipes-core/meta/meta-environment.bb b/meta/recipes-core/meta/meta-environment.bb
index bb208a3..b3737bb 100644
--- a/meta/recipes-core/meta/meta-environment.bb
+++ b/meta/recipes-core/meta/meta-environment.bb
@@ -19,7 +19,6 @@ SDKTARGETSYSROOT = "${SDKPATH}/sysroots/${REAL_MULTIMACH_TARGET_SYS}"
 
 inherit cross-canadian
 
-do_generate_content[nostamp] = "1"
 do_generate_content[cleandirs] = "${SDK_OUTPUT}"
 do_generate_content[dirs] = "${SDK_OUTPUT}/${SDKPATH}"
 python do_generate_content() {
@@ -58,7 +57,6 @@ create_sdk_files() {
 	toolchain_create_sdk_version ${SDK_OUTPUT}/${SDKPATH}/version-${REAL_MULTIMACH_TARGET_SYS}
 }
 
-do_install[nostamp] = "1"
 do_install() {
     install -d ${D}/${SDKPATH}
     install -m 0644 -t ${D}/${SDKPATH} ${SDK_OUTPUT}/${SDKPATH}/*
-- 
1.9.3



^ permalink raw reply related	[flat|nested] 17+ messages in thread

* [PATCH 02/15] classes/package: move read_shlib_providers() to a common unit
  2014-12-19 11:41 [PATCH 00/15] Developer workflow tools Paul Eggleton
  2014-12-19 11:41 ` [PATCH 01/15] meta-environment: don't mark tasks as nostamp Paul Eggleton
@ 2014-12-19 11:41 ` Paul Eggleton
  2014-12-19 11:41 ` [PATCH 03/15] lib/oe/patch: fall back to patch if git apply fails Paul Eggleton
                   ` (13 subsequent siblings)
  15 siblings, 0 replies; 17+ messages in thread
From: Paul Eggleton @ 2014-12-19 11:41 UTC (permalink / raw)
  To: openembedded-core

This allows us to use this function elsewhere in the code.
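For reference, the relocated helper parses one colon-separated line per
shared library from each .list file, building a nested mapping from soname
and path to the providing package plus extra data (as in the diff below).
A standalone sketch of just that inner loop, with a hypothetical function
name and made-up sample data:

```python
def parse_shlib_lines(dep_pkg, lines):
    """Build {soname: {path: (providing_package, extra)}} from .list lines."""
    shlib_provider = {}
    for l in lines:
        s = l.strip().split(":")
        if s[0] not in shlib_provider:
            shlib_provider[s[0]] = {}
        # The last entry found wins, matching the least-to-most-specific
        # directory ordering used by the real function
        shlib_provider[s[0]][s[1]] = (dep_pkg, s[2])
    return shlib_provider
```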

Signed-off-by: Paul Eggleton <paul.eggleton@linux.intel.com>
---
 meta/classes/package.bbclass | 24 +-----------------------
 meta/lib/oe/package.py       | 26 ++++++++++++++++++++++++++
 2 files changed, 27 insertions(+), 23 deletions(-)

diff --git a/meta/classes/package.bbclass b/meta/classes/package.bbclass
index 89cce40..692fbac 100644
--- a/meta/classes/package.bbclass
+++ b/meta/classes/package.bbclass
@@ -1389,32 +1389,11 @@ python package_do_shlibs() {
 
     pkgdest = d.getVar('PKGDEST', True)
 
-    shlibs_dirs = d.getVar('SHLIBSDIRS', True).split()
     shlibswork_dir = d.getVar('SHLIBSWORKDIR', True)
 
     # Take shared lock since we're only reading, not writing
     lf = bb.utils.lockfile(d.expand("${PACKAGELOCK}"))
 
-    def read_shlib_providers():
-        list_re = re.compile('^(.*)\.list$')
-        # Go from least to most specific since the last one found wins
-        for dir in reversed(shlibs_dirs):
-            bb.debug(2, "Reading shlib providers in %s" % (dir))
-            if not os.path.exists(dir):
-                continue
-            for file in os.listdir(dir):
-                m = list_re.match(file)
-                if m:
-                    dep_pkg = m.group(1)
-                    fd = open(os.path.join(dir, file))
-                    lines = fd.readlines()
-                    fd.close()
-                    for l in lines:
-                        s = l.strip().split(":")
-                        if s[0] not in shlib_provider:
-                            shlib_provider[s[0]] = {}
-                        shlib_provider[s[0]][s[1]] = (dep_pkg, s[2])
-
     def linux_so(file, needed, sonames, renames, pkgver):
         needs_ldconfig = False
         ldir = os.path.dirname(file).replace(pkgdest + "/" + pkg, '')
@@ -1511,8 +1490,7 @@ python package_do_shlibs() {
         use_ldconfig = False
 
     needed = {}
-    shlib_provider = {}
-    read_shlib_providers()
+    shlib_provider = oe.package.read_shlib_providers(d)
 
     for pkg in packages.split():
         private_libs = d.getVar('PRIVATE_LIBS_' + pkg, True) or d.getVar('PRIVATE_LIBS', True) or ""
diff --git a/meta/lib/oe/package.py b/meta/lib/oe/package.py
index f8b5322..ea6feaa 100644
--- a/meta/lib/oe/package.py
+++ b/meta/lib/oe/package.py
@@ -97,3 +97,29 @@ def filedeprunner(arg):
         raise e
 
     return (pkg, provides, requires)
+
+
+def read_shlib_providers(d):
+    import re
+
+    shlib_provider = {}
+    shlibs_dirs = d.getVar('SHLIBSDIRS', True).split()
+    list_re = re.compile('^(.*)\.list$')
+    # Go from least to most specific since the last one found wins
+    for dir in reversed(shlibs_dirs):
+        bb.debug(2, "Reading shlib providers in %s" % (dir))
+        if not os.path.exists(dir):
+            continue
+        for file in os.listdir(dir):
+            m = list_re.match(file)
+            if m:
+                dep_pkg = m.group(1)
+                fd = open(os.path.join(dir, file))
+                lines = fd.readlines()
+                fd.close()
+                for l in lines:
+                    s = l.strip().split(":")
+                    if s[0] not in shlib_provider:
+                        shlib_provider[s[0]] = {}
+                    shlib_provider[s[0]][s[1]] = (dep_pkg, s[2])
+    return shlib_provider
-- 
1.9.3



^ permalink raw reply related	[flat|nested] 17+ messages in thread

* [PATCH 03/15] lib/oe/patch: fall back to patch if git apply fails
  2014-12-19 11:41 [PATCH 00/15] Developer workflow tools Paul Eggleton
  2014-12-19 11:41 ` [PATCH 01/15] meta-environment: don't mark tasks as nostamp Paul Eggleton
  2014-12-19 11:41 ` [PATCH 02/15] classes/package: move read_shlib_providers() to a common unit Paul Eggleton
@ 2014-12-19 11:41 ` Paul Eggleton
  2014-12-19 11:41 ` [PATCH 04/15] lib/oe/patch: auto-commit when falling back from git am Paul Eggleton
                   ` (12 subsequent siblings)
  15 siblings, 0 replies; 17+ messages in thread
From: Paul Eggleton @ 2014-12-19 11:41 UTC (permalink / raw)
  To: openembedded-core

When PATCHTOOL = "git", git apply doesn't support fuzzy application; if
a patch requires fuzz, it's better to fall back to the standard patch
tool and apply it rather than just failing.
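The resulting behaviour is a try-next-on-failure cascade (git am, then
git apply, then plain patch). A minimal generic sketch of that pattern,
with hypothetical applier callables standing in for the real commands:

```python
def apply_with_fallback(patch, appliers):
    """Try each (name, fn) applier in order; return the first success.

    appliers is a list of (name, callable) pairs; each callable either
    returns its output or raises RuntimeError on failure.
    """
    errors = []
    for name, fn in appliers:
        try:
            return name, fn(patch)
        except RuntimeError as err:
            # Record the failure and move on to the next, less strict tool
            errors.append("%s: %s" % (name, err))
    raise RuntimeError("all appliers failed: " + "; ".join(errors))
```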

Signed-off-by: Paul Eggleton <paul.eggleton@linux.intel.com>
---
 meta/lib/oe/patch.py | 6 +++++-
 1 file changed, 5 insertions(+), 1 deletion(-)

diff --git a/meta/lib/oe/patch.py b/meta/lib/oe/patch.py
index b085c9d..788f465 100644
--- a/meta/lib/oe/patch.py
+++ b/meta/lib/oe/patch.py
@@ -219,7 +219,11 @@ class GitApplyTree(PatchTree):
             return _applypatchhelper(shellcmd, patch, force, reverse, run)
         except CmdError:
             shellcmd = ["git", "--git-dir=.", "apply", "-p%s" % patch['strippath']]
-            return _applypatchhelper(shellcmd, patch, force, reverse, run)
+            try:
+                output = _applypatchhelper(shellcmd, patch, force, reverse, run)
+            except CmdError:
+                output = PatchTree._applypatch(self, patch, force, reverse, run)
+            return output
 
 
 class QuiltTree(PatchSet):
-- 
1.9.3



^ permalink raw reply related	[flat|nested] 17+ messages in thread

* [PATCH 04/15] lib/oe/patch: auto-commit when falling back from git am
  2014-12-19 11:41 [PATCH 00/15] Developer workflow tools Paul Eggleton
                   ` (2 preceding siblings ...)
  2014-12-19 11:41 ` [PATCH 03/15] lib/oe/patch: fall back to patch if git apply fails Paul Eggleton
@ 2014-12-19 11:41 ` Paul Eggleton
  2014-12-19 11:41 ` [PATCH 05/15] lib/oe/patch: use --keep-cr with " Paul Eggleton
                   ` (11 subsequent siblings)
  15 siblings, 0 replies; 17+ messages in thread
From: Paul Eggleton @ 2014-12-19 11:41 UTC (permalink / raw)
  To: openembedded-core

When PATCHTOOL = "git", if we're not able to use "git am" to apply a
patch and fall back to "git apply" or "patch", it is desirable to
actually commit the changes, attempting to preserve (and interpret) the
patch header as part of the commit message if present. As a bonus, the
code for extracting the commit message is callable externally in case it
is useful elsewhere.
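The header parsing this introduces stops at the first diff marker and
then picks out Subject/From fields suitable for a git commit. A condensed,
standalone sketch of that logic (simplified from the code in the diff
below; the function name is illustrative):

```python
import re

# git is fussy about author formatting: it must be Name <email@domain>
AUTHOR_RE = re.compile(r'[\S ]+ <\S+@\S+\.\S+>')

def parse_patch_header(lines):
    """Return (subject, author, body) extracted from a patch's header lines."""
    subject = None
    author = None
    body = []
    for line in lines:
        if line.startswith(('Index: ', 'diff -', '---')):
            break  # start of the actual diff; the header ends here
        if line.startswith('Subject: '):
            # Strip any [PATCH][oe-core] style tags from the subject
            subject = re.sub(r'\[.+?\]\s*', '', line.split(':', 1)[1]).strip()
        elif line.startswith('From: '):
            candidate = line.split(':', 1)[1].strip()
            if AUTHOR_RE.match(candidate):
                author = candidate
        else:
            body.append(line.rstrip('\n'))
    return subject, author, body
```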

Signed-off-by: Paul Eggleton <paul.eggleton@linux.intel.com>
---
 meta/lib/oe/patch.py | 86 ++++++++++++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 86 insertions(+)

diff --git a/meta/lib/oe/patch.py b/meta/lib/oe/patch.py
index 788f465..2d56ba4 100644
--- a/meta/lib/oe/patch.py
+++ b/meta/lib/oe/patch.py
@@ -202,6 +202,78 @@ class GitApplyTree(PatchTree):
     def __init__(self, dir, d):
         PatchTree.__init__(self, dir, d)
 
+    @staticmethod
+    def extractPatchHeader(patchfile):
+        """
+        Extract just the header lines from the top of a patch file
+        """
+        lines = []
+        with open(patchfile, 'r') as f:
+            for line in f.readlines():
+                if line.startswith('Index: ') or line.startswith('diff -') or line.startswith('---'):
+                    break
+                lines.append(line)
+        return lines
+
+    @staticmethod
+    def prepareCommit(patchfile):
+        """
+        Prepare a git commit command line based on the header from a patch file
+        (typically this is useful for patches that cannot be applied with "git am" due to formatting)
+        """
+        import tempfile
+        import re
+        author_re = re.compile('[\S ]+ <\S+@\S+\.\S+>')
+        # Process patch header and extract useful information
+        lines = GitApplyTree.extractPatchHeader(patchfile)
+        outlines = []
+        author = None
+        date = None
+        for line in lines:
+            if line.startswith('Subject: '):
+                subject = line.split(':', 1)[1]
+                # Remove any [PATCH][oe-core] etc.
+                subject = re.sub(r'\[.+?\]\s*', '', subject)
+                outlines.insert(0, '%s\n\n' % subject.strip())
+                continue
+            if line.startswith('From: ') or line.startswith('Author: '):
+                authorval = line.split(':', 1)[1].strip().replace('"', '')
+                # git is fussy about author formatting i.e. it must be Name <email@domain>
+                if author_re.match(authorval):
+                    author = authorval
+                    continue
+            if line.startswith('Date: '):
+                if date is None:
+                    dateval = line.split(':', 1)[1].strip()
+                    # Very crude check for date format, since git will blow up if it's not in the right
+                    # format. Without e.g. a python-dateutils dependency we can't do a whole lot more
+                    if len(dateval) > 12:
+                        date = dateval
+                continue
+            if line.startswith('Signed-off-by: '):
+                authorval = line.split(':', 1)[1].strip().replace('"', '')
+                # git is fussy about author formatting i.e. it must be Name <email@domain>
+                if author_re.match(authorval):
+                    author = authorval
+            outlines.append(line)
+        # Add a pointer to the original patch file name
+        if outlines and outlines[-1].strip():
+            outlines.append('\n')
+        outlines.append('(from original patch: %s)\n' % os.path.basename(patchfile))
+        # Write out commit message to a file
+        with tempfile.NamedTemporaryFile('w', delete=False) as tf:
+            tmpfile = tf.name
+            for line in outlines:
+                tf.write(line)
+        # Prepare git command
+        cmd = ["git", "commit", "-F", tmpfile]
+        # git doesn't like plain email addresses as authors
+        if author and '<' in author:
+            cmd.append('--author="%s"' % author)
+        if date:
+            cmd.append('--date="%s"' % date)
+        return (tmpfile, cmd)
+
     def _applypatch(self, patch, force = False, reverse = False, run = True):
         def _applypatchhelper(shellcmd, patch, force = False, reverse = False, run = True):
             if reverse:
@@ -218,11 +290,25 @@ class GitApplyTree(PatchTree):
             shellcmd = ["git", "--work-tree=.", "am", "-3", "-p%s" % patch['strippath']]
             return _applypatchhelper(shellcmd, patch, force, reverse, run)
         except CmdError:
+            # Fall back to git apply
             shellcmd = ["git", "--git-dir=.", "apply", "-p%s" % patch['strippath']]
             try:
                 output = _applypatchhelper(shellcmd, patch, force, reverse, run)
             except CmdError:
+                # Fall back to patch
                 output = PatchTree._applypatch(self, patch, force, reverse, run)
+            # Add all files
+            shellcmd = ["git", "add", "-f", "."]
+            output += runcmd(["sh", "-c", " ".join(shellcmd)], self.dir)
+            # Exclude the patches directory
+            shellcmd = ["git", "reset", "HEAD", self.patchdir]
+            output += runcmd(["sh", "-c", " ".join(shellcmd)], self.dir)
+            # Commit the result
+            (tmpfile, shellcmd) = self.prepareCommit(patch['file'])
+            try:
+                output += runcmd(["sh", "-c", " ".join(shellcmd)], self.dir)
+            finally:
+                os.remove(tmpfile)
             return output
 
 
-- 
1.9.3



^ permalink raw reply related	[flat|nested] 17+ messages in thread

* [PATCH 05/15] lib/oe/patch: use --keep-cr with git am
  2014-12-19 11:41 [PATCH 00/15] Developer workflow tools Paul Eggleton
                   ` (3 preceding siblings ...)
  2014-12-19 11:41 ` [PATCH 04/15] lib/oe/patch: auto-commit when falling back from git am Paul Eggleton
@ 2014-12-19 11:41 ` Paul Eggleton
  2014-12-19 11:41 ` [PATCH 06/15] lib/oe/patch.py: abort "git am" if it fails Paul Eggleton
                   ` (10 subsequent siblings)
  15 siblings, 0 replies; 17+ messages in thread
From: Paul Eggleton @ 2014-12-19 11:41 UTC (permalink / raw)
  To: openembedded-core

Preserving carriage returns is important where the patch contains them.

Signed-off-by: Paul Eggleton <paul.eggleton@linux.intel.com>
---
 meta/lib/oe/patch.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/meta/lib/oe/patch.py b/meta/lib/oe/patch.py
index 2d56ba4..2bf3065 100644
--- a/meta/lib/oe/patch.py
+++ b/meta/lib/oe/patch.py
@@ -287,7 +287,7 @@ class GitApplyTree(PatchTree):
             return runcmd(["sh", "-c", " ".join(shellcmd)], self.dir)
 
         try:
-            shellcmd = ["git", "--work-tree=.", "am", "-3", "-p%s" % patch['strippath']]
+            shellcmd = ["git", "--work-tree=.", "am", "-3", "--keep-cr", "-p%s" % patch['strippath']]
             return _applypatchhelper(shellcmd, patch, force, reverse, run)
         except CmdError:
             # Fall back to git apply
-- 
1.9.3



^ permalink raw reply related	[flat|nested] 17+ messages in thread

* [PATCH 06/15] lib/oe/patch.py: abort "git am" if it fails
  2014-12-19 11:41 [PATCH 00/15] Developer workflow tools Paul Eggleton
                   ` (4 preceding siblings ...)
  2014-12-19 11:41 ` [PATCH 05/15] lib/oe/patch: use --keep-cr with " Paul Eggleton
@ 2014-12-19 11:41 ` Paul Eggleton
  2014-12-19 11:41 ` [PATCH 07/15] lib/oe/patch: add support for extracting patches from git tree Paul Eggleton
                   ` (9 subsequent siblings)
  15 siblings, 0 replies; 17+ messages in thread
From: Paul Eggleton @ 2014-12-19 11:41 UTC (permalink / raw)
  To: openembedded-core

If we don't do this, we may still be left in git am's conflict
resolution mode at the end of applying patches, which is not desirable.

Signed-off-by: Paul Eggleton <paul.eggleton@linux.intel.com>
---
 meta/lib/oe/patch.py | 6 ++++++
 1 file changed, 6 insertions(+)

diff --git a/meta/lib/oe/patch.py b/meta/lib/oe/patch.py
index 2bf3065..5227781 100644
--- a/meta/lib/oe/patch.py
+++ b/meta/lib/oe/patch.py
@@ -290,6 +290,12 @@ class GitApplyTree(PatchTree):
             shellcmd = ["git", "--work-tree=.", "am", "-3", "--keep-cr", "-p%s" % patch['strippath']]
             return _applypatchhelper(shellcmd, patch, force, reverse, run)
         except CmdError:
+            # Need to abort the git am, or we'll still be within it at the end
+            try:
+                shellcmd = ["git", "--work-tree=.", "am", "--abort"]
+                runcmd(["sh", "-c", " ".join(shellcmd)], self.dir)
+            except CmdError:
+                pass
             # Fall back to git apply
             shellcmd = ["git", "--git-dir=.", "apply", "-p%s" % patch['strippath']]
             try:
-- 
1.9.3



^ permalink raw reply related	[flat|nested] 17+ messages in thread

* [PATCH 07/15] lib/oe/patch: add support for extracting patches from git tree
  2014-12-19 11:41 [PATCH 00/15] Developer workflow tools Paul Eggleton
                   ` (5 preceding siblings ...)
  2014-12-19 11:41 ` [PATCH 06/15] lib/oe/patch.py: abort "git am" if it fails Paul Eggleton
@ 2014-12-19 11:41 ` Paul Eggleton
  2014-12-19 11:41 ` [PATCH 08/15] lib/oe: add recipeutils module Paul Eggleton
                   ` (8 subsequent siblings)
  15 siblings, 0 replies; 17+ messages in thread
From: Paul Eggleton @ 2014-12-19 11:41 UTC (permalink / raw)
  To: openembedded-core

When patches from a recipe have been written out to a git tree, we also
want to be able to do the reverse so we can update the patches next to
the recipe. This is implemented by adding a comment to each commit
message (using git hooks) which we can extract later on.
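The marker the hooks append ("%% original patch: <name>") makes each
commit self-describing, so extraction is then a matter of stripping those
lines back out while remembering the file name. A standalone sketch of
that round trip, mirroring the prefix and format used in the diff below
(function names are illustrative):

```python
PATCH_LINE_PREFIX = '%% original patch'

def add_marker(message_lines, patchfile):
    """What the commit-msg hook does: append the original patch file name."""
    return message_lines + ['\n', '%s: %s\n' % (PATCH_LINE_PREFIX, patchfile)]

def strip_marker(message_lines):
    """The reverse: recover the file name and drop the marker lines."""
    outfile = None
    kept = []
    for line in message_lines:
        if line.startswith(PATCH_LINE_PREFIX):
            outfile = line.split()[-1].strip()
            continue
        kept.append(line)
    return outfile, kept
```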

Signed-off-by: Paul Eggleton <paul.eggleton@linux.intel.com>
---
 meta/lib/oe/patch.py | 112 +++++++++++++++++++++++++++++++++++++--------------
 1 file changed, 82 insertions(+), 30 deletions(-)

diff --git a/meta/lib/oe/patch.py b/meta/lib/oe/patch.py
index 5227781..b838be8 100644
--- a/meta/lib/oe/patch.py
+++ b/meta/lib/oe/patch.py
@@ -199,6 +199,8 @@ class PatchTree(PatchSet):
         self.Pop(all=True)
 
 class GitApplyTree(PatchTree):
+    patch_line_prefix = '%% original patch'
+
     def __init__(self, dir, d):
         PatchTree.__init__(self, dir, d)
 
@@ -256,10 +258,6 @@ class GitApplyTree(PatchTree):
                 if author_re.match(authorval):
                     author = authorval
             outlines.append(line)
-        # Add a pointer to the original patch file name
-        if outlines and outlines[-1].strip():
-            outlines.append('\n')
-        outlines.append('(from original patch: %s)\n' % os.path.basename(patchfile))
         # Write out commit message to a file
         with tempfile.NamedTemporaryFile('w', delete=False) as tf:
             tmpfile = tf.name
@@ -274,7 +272,35 @@ class GitApplyTree(PatchTree):
             cmd.append('--date="%s"' % date)
         return (tmpfile, cmd)
 
+    @staticmethod
+    def extractPatches(tree, startcommit, outdir):
+        import tempfile
+        import shutil
+        tempdir = tempfile.mkdtemp(prefix='oepatch')
+        try:
+            shellcmd = ["git", "format-patch", startcommit, "-o", tempdir]
+            out = runcmd(["sh", "-c", " ".join(shellcmd)], tree)
+            if out:
+                for srcfile in out.split():
+                    patchlines = []
+                    outfile = None
+                    with open(srcfile, 'r') as f:
+                        for line in f:
+                            if line.startswith(GitApplyTree.patch_line_prefix):
+                                outfile = line.split()[-1].strip()
+                                continue
+                            patchlines.append(line)
+                    if not outfile:
+                        outfile = os.path.basename(srcfile)
+                    with open(os.path.join(outdir, outfile), 'w') as of:
+                        for line in patchlines:
+                            of.write(line)
+        finally:
+            shutil.rmtree(tempdir)
+
     def _applypatch(self, patch, force = False, reverse = False, run = True):
+        import shutil
+
         def _applypatchhelper(shellcmd, patch, force = False, reverse = False, run = True):
             if reverse:
                 shellcmd.append('-R')
@@ -286,36 +312,62 @@ class GitApplyTree(PatchTree):
 
             return runcmd(["sh", "-c", " ".join(shellcmd)], self.dir)
 
+        # Add hooks which add a pointer to the original patch file name in the commit message
+        commithook = os.path.join(self.dir, '.git', 'hooks', 'commit-msg')
+        commithook_backup = commithook + '.devtool-orig'
+        applyhook = os.path.join(self.dir, '.git', 'hooks', 'applypatch-msg')
+        applyhook_backup = applyhook + '.devtool-orig'
+        if os.path.exists(commithook):
+            shutil.move(commithook, commithook_backup)
+        if os.path.exists(applyhook):
+            shutil.move(applyhook, applyhook_backup)
+        with open(commithook, 'w') as f:
+            # NOTE: the formatting here is significant; if you change it you'll also need to
+            # change other places which read it back
+            f.write('echo >> $1\n')
+            f.write('echo "%s: $PATCHFILE" >> $1\n' % GitApplyTree.patch_line_prefix)
+        os.chmod(commithook, 0755)
+        shutil.copy2(commithook, applyhook)
         try:
-            shellcmd = ["git", "--work-tree=.", "am", "-3", "--keep-cr", "-p%s" % patch['strippath']]
-            return _applypatchhelper(shellcmd, patch, force, reverse, run)
-        except CmdError:
-            # Need to abort the git am, or we'll still be within it at the end
-            try:
-                shellcmd = ["git", "--work-tree=.", "am", "--abort"]
-                runcmd(["sh", "-c", " ".join(shellcmd)], self.dir)
-            except CmdError:
-                pass
-            # Fall back to git apply
-            shellcmd = ["git", "--git-dir=.", "apply", "-p%s" % patch['strippath']]
+            patchfilevar = 'PATCHFILE="%s"' % os.path.basename(patch['file'])
             try:
-                output = _applypatchhelper(shellcmd, patch, force, reverse, run)
+                shellcmd = [patchfilevar, "git", "--work-tree=.", "am", "-3", "--keep-cr", "-p%s" % patch['strippath']]
+                return _applypatchhelper(shellcmd, patch, force, reverse, run)
             except CmdError:
-                # Fall back to patch
-                output = PatchTree._applypatch(self, patch, force, reverse, run)
-            # Add all files
-            shellcmd = ["git", "add", "-f", "."]
-            output += runcmd(["sh", "-c", " ".join(shellcmd)], self.dir)
-            # Exclude the patches directory
-            shellcmd = ["git", "reset", "HEAD", self.patchdir]
-            output += runcmd(["sh", "-c", " ".join(shellcmd)], self.dir)
-            # Commit the result
-            (tmpfile, shellcmd) = self.prepareCommit(patch['file'])
-            try:
+                # Need to abort the git am, or we'll still be within it at the end
+                try:
+                    shellcmd = ["git", "--work-tree=.", "am", "--abort"]
+                    runcmd(["sh", "-c", " ".join(shellcmd)], self.dir)
+                except CmdError:
+                    pass
+                # Fall back to git apply
+                shellcmd = ["git", "--git-dir=.", "apply", "-p%s" % patch['strippath']]
+                try:
+                    output = _applypatchhelper(shellcmd, patch, force, reverse, run)
+                except CmdError:
+                    # Fall back to patch
+                    output = PatchTree._applypatch(self, patch, force, reverse, run)
+                # Add all files
+                shellcmd = ["git", "add", "-f", "."]
+                output += runcmd(["sh", "-c", " ".join(shellcmd)], self.dir)
+                # Exclude the patches directory
+                shellcmd = ["git", "reset", "HEAD", self.patchdir]
                 output += runcmd(["sh", "-c", " ".join(shellcmd)], self.dir)
-            finally:
-                os.remove(tmpfile)
-            return output
+                # Commit the result
+                (tmpfile, shellcmd) = self.prepareCommit(patch['file'])
+                try:
+                    shellcmd.insert(0, patchfilevar)
+                    output += runcmd(["sh", "-c", " ".join(shellcmd)], self.dir)
+                finally:
+                    os.remove(tmpfile)
+                return output
+        finally:
+            os.remove(commithook)
+            os.remove(applyhook)
+            if os.path.exists(commithook_backup):
+                shutil.move(commithook_backup, commithook)
+            if os.path.exists(applyhook_backup):
+                shutil.move(applyhook_backup, applyhook)
 
 
 class QuiltTree(PatchSet):
-- 
1.9.3



^ permalink raw reply related	[flat|nested] 17+ messages in thread

* [PATCH 08/15] lib/oe: add recipeutils module
  2014-12-19 11:41 [PATCH 00/15] Developer workflow tools Paul Eggleton
                   ` (6 preceding siblings ...)
  2014-12-19 11:41 ` [PATCH 07/15] lib/oe/patch: add support for extracting patches from git tree Paul Eggleton
@ 2014-12-19 11:41 ` Paul Eggleton
  2014-12-19 11:41 ` [PATCH 09/15] oeqa/utils: make get_bb_var() more reliable Paul Eggleton
                   ` (7 subsequent siblings)
  15 siblings, 0 replies; 17+ messages in thread
From: Paul Eggleton @ 2014-12-19 11:41 UTC (permalink / raw)
  To: openembedded-core

Add a module to help provide utility functions for dealing with recipes.
This would typically be used by external tools.

Substantial portions of this module were borrowed from the OE Layer
index code; other functions originally contributed by
Markus Lehtonen <markus.lehtonen@intel.com>.
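As a hedged illustration (not part of the patch): the insertion-ordering idea used by patch_recipe_file() - ranking new variables by their position in a canonical progression so they land in a conventional spot in the recipe - can be sketched in plain Python. The progression list here is a shortened stand-in for the full recipe_progression defined in the module:

```python
# Shortened stand-in for the module's recipe_progression list
recipe_progression = ['SUMMARY', 'LICENSE', 'SRC_URI', 'do_install']

def order_new_vars(values):
    """Return variable names sorted by progression index (-1 = unknown,
    mirroring how remainingnames is built and sorted in the module)."""
    ranked = {k: (recipe_progression.index(k) if k in recipe_progression else -1)
              for k in values}
    return [k for k, _ in sorted(ranked.items(), key=lambda x: x[1])]

print(order_new_vars({'SRC_URI': 'http://example.com/foo.tar.gz',
                      'SUMMARY': 'Example', 'LICENSE': 'MIT'}))
# → ['SUMMARY', 'LICENSE', 'SRC_URI']
```

Variables not in the progression sort first (index -1), just as in the module.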

Signed-off-by: Paul Eggleton <paul.eggleton@linux.intel.com>
---
 meta/lib/oe/recipeutils.py | 279 +++++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 279 insertions(+)
 create mode 100644 meta/lib/oe/recipeutils.py

diff --git a/meta/lib/oe/recipeutils.py b/meta/lib/oe/recipeutils.py
new file mode 100644
index 0000000..1758dce
--- /dev/null
+++ b/meta/lib/oe/recipeutils.py
@@ -0,0 +1,279 @@
+# Utility functions for reading and modifying recipes
+#
+# Some code borrowed from the OE layer index
+#
+# Copyright (C) 2013-2014 Intel Corporation
+#
+
+import sys
+import os
+import os.path
+import tempfile
+import textwrap
+import difflib
+import utils
+import shutil
+import re
+from collections import OrderedDict, defaultdict
+
+
+# Help us to find places to insert values
+recipe_progression = ['SUMMARY', 'DESCRIPTION', 'HOMEPAGE', 'BUGTRACKER', 'SECTION', 'LICENSE', 'LIC_FILES_CHKSUM', 'PROVIDES', 'DEPENDS', 'PR', 'PV', 'SRC_URI', 'do_fetch', 'do_unpack', 'do_patch', 'EXTRA_OECONF', 'do_configure', 'EXTRA_OEMAKE', 'do_compile', 'do_install', 'do_populate_sysroot', 'INITSCRIPT', 'USERADD', 'GROUPADD', 'PACKAGES', 'FILES', 'RDEPENDS', 'RRECOMMENDS', 'RSUGGESTS', 'RPROVIDES', 'RREPLACES', 'RCONFLICTS', 'ALLOW_EMPTY', 'do_package', 'do_deploy']
+# Variables that sometimes are a bit long but shouldn't be wrapped
+nowrap_vars = ['SUMMARY', 'HOMEPAGE', 'BUGTRACKER']
+list_vars = ['SRC_URI', 'LIC_FILES_CHKSUM']
+meta_vars = ['SUMMARY', 'DESCRIPTION', 'HOMEPAGE', 'BUGTRACKER', 'SECTION']
+
+
+def pn_to_recipe(cooker, pn):
+    """Convert a recipe name (PN) to the path to the recipe file"""
+    import bb.providers
+
+    if pn in cooker.recipecache.pkg_pn:
+        filenames = cooker.recipecache.pkg_pn[pn]
+        best = bb.providers.findBestProvider(pn, cooker.data, cooker.recipecache, cooker.recipecache.pkg_pn)
+        return best[3]
+    else:
+        return None
+
+
+def get_unavailable_reasons(cooker, pn):
+    """If a recipe could not be found, find out why if possible"""
+    import bb.taskdata
+    taskdata = bb.taskdata.TaskData(None, skiplist=cooker.skiplist)
+    return taskdata.get_reasons(pn)
+
+
+def parse_recipe(fn, d):
+    """Parse an individual recipe"""
+    import bb.cache
+    envdata = bb.cache.Cache.loadDataFull(fn, [], d)
+    return envdata
+
+
+def get_var_files(fn, varlist, d):
+    """Find the file in which each of a list of variables is set.
+    Note: requires variable history to be enabled when parsing.
+    """
+    envdata = parse_recipe(fn, d)
+    varfiles = {}
+    for v in varlist:
+        history = envdata.varhistory.variable(v)
+        files = []
+        for event in history:
+            if 'file' in event and not 'flag' in event:
+                files.append(event['file'])
+        if files:
+            actualfile = files[-1]
+        else:
+            actualfile = None
+        varfiles[v] = actualfile
+
+    return varfiles
+
+
+def patch_recipe_file(fn, values, patch=False, relpath=''):
+    """Update or insert variable values into a recipe file (assuming you
+       have already identified the exact file you want to update.)
+       Note that some manual inspection/intervention may be required
+       since this cannot handle all situations.
+    """
+    remainingnames = {}
+    for k in values.keys():
+        remainingnames[k] = recipe_progression.index(k) if k in recipe_progression else -1
+    remainingnames = OrderedDict(sorted(remainingnames.iteritems(), key=lambda x: x[1]))
+
+    with tempfile.NamedTemporaryFile('w', delete=False) as tf:
+        def outputvalue(name):
+            rawtext = '%s = "%s"\n' % (name, values[name])
+            if name in nowrap_vars:
+                tf.write(rawtext)
+            elif name in list_vars:
+                splitvalue = values[name].split()
+                if len(splitvalue) > 1:
+                    linesplit = ' \\\n' + (' ' * (len(name) + 4))
+                    tf.write('%s = "%s%s"\n' % (name, linesplit.join(splitvalue), linesplit))
+                else:
+                    tf.write(rawtext)
+            else:
+                wrapped = textwrap.wrap(rawtext)
+                for wrapline in wrapped[:-1]:
+                    tf.write('%s \\\n' % wrapline)
+                tf.write('%s\n' % wrapped[-1])
+
+        tfn = tf.name
+        with open(fn, 'r') as f:
+            # First runthrough - find existing names (so we know not to insert based on recipe_progression)
+            # Second runthrough - make the changes
+            existingnames = []
+            for runthrough in [1, 2]:
+                currname = None
+                for line in f:
+                    if not currname:
+                        insert = False
+                        for k in remainingnames.keys():
+                            for p in recipe_progression:
+                                if re.match('^%s[ ?:=]' % p, line):
+                                    if remainingnames[k] > -1 and recipe_progression.index(p) > remainingnames[k] and runthrough > 1 and not k in existingnames:
+                                        outputvalue(k)
+                                        del remainingnames[k]
+                                    break
+                        for k in remainingnames.keys():
+                            if re.match('^%s[ ?:=]' % k, line):
+                                currname = k
+                                if runthrough == 1:
+                                    existingnames.append(k)
+                                else:
+                                    del remainingnames[k]
+                                break
+                        if currname and runthrough > 1:
+                            outputvalue(currname)
+
+                    if currname:
+                        sline = line.rstrip()
+                        if not sline.endswith('\\'):
+                            currname = None
+                        continue
+                    if runthrough > 1:
+                        tf.write(line)
+                f.seek(0)
+        if remainingnames:
+            tf.write('\n')
+            for k in remainingnames.keys():
+                outputvalue(k)
+
+    with open(tfn, 'U') as f:
+        tolines = f.readlines()
+    if patch:
+        with open(fn, 'U') as f:
+            fromlines = f.readlines()
+        relfn = os.path.relpath(fn, relpath)
+        diff = difflib.unified_diff(fromlines, tolines, 'a/%s' % relfn, 'b/%s' % relfn)
+        os.remove(tfn)
+        return diff
+    else:
+        with open(fn, 'w') as f:
+            f.writelines(tolines)
+        os.remove(tfn)
+        return None
+
+def localise_file_vars(fn, varfiles, varlist):
+    """Given a list of variables and variable history (fetched with get_var_files())
+    find where each variable should be set/changed. This handles for example where a
+    recipe includes an inc file where variables might be changed - in most cases
+    we want to update the inc file when changing the variable value rather than adding
+    it to the recipe itself.
+    """
+    fndir = os.path.dirname(fn) + os.sep
+
+    first_meta_file = None
+    for v in meta_vars:
+        f = varfiles.get(v, None)
+        if f:
+            actualdir = os.path.dirname(f) + os.sep
+            if actualdir.startswith(fndir):
+                first_meta_file = f
+                break
+
+    filevars = defaultdict(list)
+    for v in varlist:
+        f = varfiles[v]
+        # Only return files that are in the same directory as the recipe or in some directory below there
+        # (this excludes bbclass files and common inc files that wouldn't be appropriate to set the variable
+        # in if we were going to set a value specific to this recipe)
+        if f:
+            actualfile = f
+        else:
+            # Variable isn't in a file, if it's one of the "meta" vars, use the first file with a meta var in it
+            if first_meta_file:
+                actualfile = first_meta_file
+            else:
+                actualfile = fn
+
+        actualdir = os.path.dirname(actualfile) + os.sep
+        if not actualdir.startswith(fndir):
+            actualfile = fn
+        filevars[actualfile].append(v)
+
+    return filevars
+
+def patch_recipe(d, fn, varvalues, patch=False, relpath=''):
+    """Modify a list of variable values in the specified recipe. Handles inc files if
+    used by the recipe.
+    """
+    varlist = varvalues.keys()
+    varfiles = get_var_files(fn, varlist, d)
+    locs = localise_file_vars(fn, varfiles, varlist)
+    patches = []
+    for f,v in locs.iteritems():
+        vals = {k: varvalues[k] for k in v}
+        patchdata = patch_recipe_file(f, vals, patch, relpath)
+        if patch:
+            patches.append(patchdata)
+
+    if patch:
+        return patches
+    else:
+        return None
+
+
+
+def copy_recipe_files(d, tgt_dir, whole_dir=False, download=True):
+    """Copy (local) recipe files, including both files included via include/require,
+    and files referred to in the SRC_URI variable."""
+    import bb.fetch2
+    import oe.path
+
+    # FIXME need a warning if the unexpanded SRC_URI value contains variable references
+
+    uris = (d.getVar('SRC_URI', True) or "").split()
+    fetch = bb.fetch2.Fetch(uris, d)
+    if download:
+        fetch.download()
+
+    # Copy local files to target directory and gather any remote files
+    bb_dir = os.path.dirname(d.getVar('FILE', True)) + os.sep
+    remotes = []
+    includes = [path for path in d.getVar('BBINCLUDED', True).split() if
+                path.startswith(bb_dir) and os.path.exists(path)]
+    for path in fetch.localpaths() + includes:
+        # Only import files that are under the meta directory
+        if path.startswith(bb_dir):
+            if not whole_dir:
+                relpath = os.path.relpath(path, bb_dir)
+                subdir = os.path.join(tgt_dir, os.path.dirname(relpath))
+                if not os.path.exists(subdir):
+                    os.makedirs(subdir)
+                shutil.copy2(path, os.path.join(tgt_dir, relpath))
+        else:
+            remotes.append(path)
+    # Simply copy whole meta dir, if requested
+    if whole_dir:
+        shutil.copytree(bb_dir, tgt_dir)
+
+    return remotes
+
+
+def get_recipe_patches(d):
+    """Get a list of the patches included in SRC_URI within a recipe."""
+    patchfiles = []
+    # Execute src_patches() defined in patch.bbclass - this works since that class
+    # is inherited globally
+    patches = bb.utils.exec_flat_python_func('src_patches', d)
+    for patch in patches:
+        _, _, local, _, _, parm = bb.fetch.decodeurl(patch)
+        patchfiles.append(local)
+    return patchfiles
+
+
+def validate_pn(pn):
+    """Perform validation on a recipe name (PN) for a new recipe."""
+    reserved_names = ['forcevariable', 'append', 'prepend', 'remove']
+    if not re.match('^[0-9a-z-]+$', pn):
+        return 'Recipe name "%s" is invalid: only characters 0-9, a-z and - are allowed' % pn
+    elif pn in reserved_names:
+        return 'Recipe name "%s" is invalid: is a reserved keyword' % pn
+    elif pn.startswith('pn-'):
+        return 'Recipe name "%s" is invalid: names starting with "pn-" are reserved' % pn
+    return ''
+
-- 
1.9.3




* [PATCH 09/15] oeqa/utils: make get_bb_var() more reliable
  2014-12-19 11:41 [PATCH 00/15] Developer workflow tools Paul Eggleton
                   ` (7 preceding siblings ...)
  2014-12-19 11:41 ` [PATCH 08/15] lib/oe: add recipeutils module Paul Eggleton
@ 2014-12-19 11:41 ` Paul Eggleton
  2014-12-19 11:41 ` [PATCH 10/15] classes/externalsrc: set do_compile as nostamp Paul Eggleton
                   ` (6 subsequent siblings)
  15 siblings, 0 replies; 17+ messages in thread
From: Paul Eggleton @ 2014-12-19 11:41 UTC (permalink / raw)
  To: openembedded-core

* Enable querying exported variables
* Use strip() to remove quotes so any internal quotes are not disturbed
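To illustrate the second point (a standalone Python sketch, not part of the patch): replace() removes every double quote in the value, including internal ones, while strip() only removes them from the ends:

```python
# A bitbake environment line with quotes inside the value
line = 'FOO="value with "internal" quotes"'
val = line.split('=')[1]

# Old behaviour: every quote character is removed
assert val.replace('"', '') == 'value with internal quotes'

# New behaviour: only the surrounding quotes are stripped
assert val.strip('"') == 'value with "internal" quotes'
```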

Signed-off-by: Paul Eggleton <paul.eggleton@linux.intel.com>
---
 meta/lib/oeqa/utils/commands.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/meta/lib/oeqa/utils/commands.py b/meta/lib/oeqa/utils/commands.py
index 802bc2f..d29c1b1 100644
--- a/meta/lib/oeqa/utils/commands.py
+++ b/meta/lib/oeqa/utils/commands.py
@@ -138,9 +138,9 @@ def get_bb_var(var, target=None, postconfig=None):
     val = None
     bbenv = get_bb_env(target, postconfig=postconfig)
     for line in bbenv.splitlines():
-        if line.startswith(var + "="):
+        if line.startswith(var + "=") or line.startswith("export " + var + "="):
             val = line.split('=')[1]
-            val = val.replace('\"','')
+            val = val.strip('\"')
             break
     return val
 
-- 
1.9.3




* [PATCH 10/15] classes/externalsrc: set do_compile as nostamp
  2014-12-19 11:41 [PATCH 00/15] Developer workflow tools Paul Eggleton
                   ` (8 preceding siblings ...)
  2014-12-19 11:41 ` [PATCH 09/15] oeqa/utils: make get_bb_var() more reliable Paul Eggleton
@ 2014-12-19 11:41 ` Paul Eggleton
  2014-12-19 11:41 ` [PATCH 11/15] scripts/recipetool: Add a recipe auto-creation script Paul Eggleton
                   ` (5 subsequent siblings)
  15 siblings, 0 replies; 17+ messages in thread
From: Paul Eggleton @ 2014-12-19 11:41 UTC (permalink / raw)
  To: openembedded-core

Most of the time what you want when using this class is for do_compile
to execute more than just once - ideally every time the source changes,
but that's a little tricky to accomplish. Thus, set do_compile as
nostamp to get something close. Note that in order to be effective this
also requires the change to bitbake that causes nostamp task signatures
to change on each execution.
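The effect of the flag can be sketched with a toy stamp check (illustrative only; BitBake's real scheduler and signature handling are considerably more involved):

```python
import os
import tempfile

def task_needs_running(stampfile, flags):
    """A nostamp task never consults its stamp, so it always runs;
    a stamped task is skipped while its stamp file exists."""
    if flags.get('nostamp'):
        return True
    return not os.path.exists(stampfile)

with tempfile.NamedTemporaryFile() as stamp:
    # With an up-to-date stamp the task is skipped...
    assert not task_needs_running(stamp.name, {})
    # ...but marking it nostamp forces it to run every time
    assert task_needs_running(stamp.name, {'nostamp': '1'})
```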

Signed-off-by: Paul Eggleton <paul.eggleton@linux.intel.com>
---
 meta/classes/externalsrc.bbclass | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/meta/classes/externalsrc.bbclass b/meta/classes/externalsrc.bbclass
index 2ac6274..4e429d7 100644
--- a/meta/classes/externalsrc.bbclass
+++ b/meta/classes/externalsrc.bbclass
@@ -49,5 +49,8 @@ python () {
 
         for task in d.getVar("SRCTREECOVEREDTASKS", True).split():
             bb.build.deltask(task, d)
+
+        # Ensure compilation happens every time
+        d.setVarFlag('do_compile', 'nostamp', '1')
 }
 
-- 
1.9.3




* [PATCH 11/15] scripts/recipetool: Add a recipe auto-creation script
  2014-12-19 11:41 [PATCH 00/15] Developer workflow tools Paul Eggleton
                   ` (9 preceding siblings ...)
  2014-12-19 11:41 ` [PATCH 10/15] classes/externalsrc: set do_compile as nostamp Paul Eggleton
@ 2014-12-19 11:41 ` Paul Eggleton
  2014-12-19 11:41 ` [PATCH 12/15] scripts: add scriptutils module Paul Eggleton
                   ` (4 subsequent siblings)
  15 siblings, 0 replies; 17+ messages in thread
From: Paul Eggleton @ 2014-12-19 11:41 UTC (permalink / raw)
  To: openembedded-core

Add a more maintainable and flexible script for creating at least the
skeleton of a recipe based on an examination of the source tree.
Commands can be added and the creation process can be extended through
plugins.
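The plugin mechanism can be sketched as follows (a minimal standalone approximation; the real script also discovers plugin modules on disk and passes a tinfoil instance to them):

```python
class ExampleHandler:
    """Mirrors the RecipeHandler.process() contract from create.py."""
    def process(self, srctree, classes, lines_before, lines_after, handled):
        # A real handler inspects srctree and decides what to inherit
        classes.append('autotools')
        return True

class ExamplePlugin:
    def register_recipe_handlers(self, handlers):
        handlers.append(ExampleHandler())

plugins = [ExamplePlugin()]

# As in create_recipe(): collect handlers from all plugins, then apply them
handlers = []
for plugin in plugins:
    if hasattr(plugin, 'register_recipe_handlers'):
        plugin.register_recipe_handlers(handlers)

classes, handled = [], []
for handler in handlers:
    handler.process('/path/to/srctree', classes, [], [], handled)

print(classes)  # → ['autotools'] - the classes the generated recipe inherits
```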

[YOCTO #6406]

Signed-off-by: Paul Eggleton <paul.eggleton@linux.intel.com>
---
 scripts/lib/recipetool/__init__.py        |   0
 scripts/lib/recipetool/create.py          | 413 ++++++++++++++++++++++++++++++
 scripts/lib/recipetool/create_buildsys.py | 319 +++++++++++++++++++++++
 scripts/recipetool                        |  99 +++++++
 4 files changed, 831 insertions(+)
 create mode 100644 scripts/lib/recipetool/__init__.py
 create mode 100644 scripts/lib/recipetool/create.py
 create mode 100644 scripts/lib/recipetool/create_buildsys.py
 create mode 100755 scripts/recipetool

diff --git a/scripts/lib/recipetool/__init__.py b/scripts/lib/recipetool/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/scripts/lib/recipetool/create.py b/scripts/lib/recipetool/create.py
new file mode 100644
index 0000000..6dba078
--- /dev/null
+++ b/scripts/lib/recipetool/create.py
@@ -0,0 +1,413 @@
+# Recipe creation tool - create command plugin
+#
+# Copyright (C) 2014 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import sys
+import os
+import argparse
+import glob
+import fnmatch
+import re
+import logging
+
+logger = logging.getLogger('recipetool')
+
+tinfoil = None
+plugins = None
+
+def plugin_init(pluginlist):
+    # Take a reference to the list so we can use it later
+    global plugins
+    plugins = pluginlist
+
+def tinfoil_init(instance):
+    global tinfoil
+    tinfoil = instance
+
+class RecipeHandler():
+    @staticmethod
+    def checkfiles(path, speclist):
+        results = []
+        for spec in speclist:
+            results.extend(glob.glob(os.path.join(path, spec)))
+        return results
+
+    def genfunction(self, outlines, funcname, content):
+        outlines.append('%s () {' % funcname)
+        for line in content:
+            outlines.append('\t%s' % line)
+        outlines.append('}')
+        outlines.append('')
+
+    def process(self, srctree, classes, lines_before, lines_after, handled):
+        return False
+
+
+
+def fetch_source(uri, destdir):
+    import bb.data
+    bb.utils.mkdirhier(destdir)
+    localdata = bb.data.createCopy(tinfoil.config_data)
+    bb.data.update_data(localdata)
+    localdata.setVar('BB_STRICT_CHECKSUM', '')
+    localdata.setVar('SRCREV', '${AUTOREV}')
+    ret = (None, None)
+    olddir = os.getcwd()
+    try:
+        fetcher = bb.fetch2.Fetch([uri], localdata)
+        for u in fetcher.ud:
+            ud = fetcher.ud[u]
+            ud.ignore_checksums = True
+        fetcher.download()
+        fetcher.unpack(destdir)
+        for u in fetcher.ud:
+            ud = fetcher.ud[u]
+            if ud.method.recommends_checksum(ud):
+                md5value = bb.utils.md5_file(ud.localpath)
+                sha256value = bb.utils.sha256_file(ud.localpath)
+                ret = (md5value, sha256value)
+    except bb.fetch2.BBFetchException, e:
+        raise bb.build.FuncFailed(e)
+    finally:
+        os.chdir(olddir)
+    return ret
+
+def supports_srcrev(uri):
+    localdata = bb.data.createCopy(tinfoil.config_data)
+    bb.data.update_data(localdata)
+    fetcher = bb.fetch2.Fetch([uri], localdata)
+    urldata = fetcher.ud
+    for u in urldata:
+        if urldata[u].method.supports_srcrev():
+            return True
+    return False
+
+def create_recipe(args):
+    import bb.process
+    import tempfile
+    import shutil
+
+    pkgarch = ""
+    if args.machine:
+        pkgarch = "${MACHINE_ARCH}"
+
+    checksums = (None, None)
+    tempsrc = ''
+    srcsubdir = ''
+    if '://' in args.source:
+        # Fetch a URL
+        srcuri = args.source
+        if args.externalsrc:
+            srctree = args.externalsrc
+        else:
+            tempsrc = tempfile.mkdtemp(prefix='recipetool-')
+            srctree = tempsrc
+        logger.info('Fetching %s...' % srcuri)
+        checksums = fetch_source(args.source, srctree)
+        dirlist = os.listdir(srctree)
+        if 'git.indirectionsymlink' in dirlist:
+            dirlist.remove('git.indirectionsymlink')
+        if len(dirlist) == 1 and os.path.isdir(os.path.join(srctree, dirlist[0])):
+            # We unpacked a single directory, so we should use that
+            srcsubdir = dirlist[0]
+            srctree = os.path.join(srctree, srcsubdir)
+    else:
+        # Assume we're pointing to an existing source tree
+        if args.externalsrc:
+            logger.error('externalsrc cannot be specified if source is a directory')
+            sys.exit(1)
+        if not os.path.isdir(args.source):
+            logger.error('Invalid source directory %s' % args.source)
+            sys.exit(1)
+        srcuri = ''
+        srctree = args.source
+
+    outfile = args.outfile
+    if outfile and outfile != '-':
+        if os.path.exists(outfile):
+            logger.error('Output file %s already exists' % outfile)
+            sys.exit(1)
+
+    lines_before = []
+    lines_after = []
+
+    lines_before.append('# Recipe created by %s' % os.path.basename(sys.argv[0]))
+    lines_before.append('# This is the basis of a recipe and may need further editing in order to be fully functional.')
+    lines_before.append('# (Feel free to remove these comments when editing.)')
+    lines_before.append('#')
+
+    licvalues = guess_license(srctree)
+    lic_files_chksum = []
+    if licvalues:
+        licenses = []
+        for licvalue in licvalues:
+            if not licvalue[0] in licenses:
+                licenses.append(licvalue[0])
+            lic_files_chksum.append('file://%s;md5=%s' % (licvalue[1], licvalue[2]))
+        lines_before.append('# WARNING: the following LICENSE and LIC_FILES_CHKSUM values are best guesses - it is')
+        lines_before.append('# your responsibility to verify that the values are complete and correct.')
+        if len(licvalues) > 1:
+            lines_before.append('#')
+            lines_before.append('# NOTE: multiple licenses have been detected; if that is correct you should separate')
+            lines_before.append('# these in the LICENSE value using & if the multiple licenses all apply, or | if there')
+            lines_before.append('# is a choice between the multiple licenses. If in doubt, check the accompanying')
+            lines_before.append('# documentation to determine which situation is applicable.')
+    else:
+        lines_before.append('# Unable to find any files that looked like license statements. Check the accompanying')
+        lines_before.append('# documentation and source headers and set LICENSE and LIC_FILES_CHKSUM accordingly.')
+        lines_before.append('#')
+        lines_before.append('# NOTE: LICENSE is being set to "CLOSED" to allow you to at least start building - if')
+        lines_before.append('# this is not accurate with respect to the licensing of the software being built (it')
+        lines_before.append('# will not be in most cases) you must specify the correct value before using this')
+        lines_before.append('# recipe for anything other than initial testing/development!')
+        licenses = ['CLOSED']
+    lines_before.append('LICENSE = "%s"' % ' '.join(licenses))
+    lines_before.append('LIC_FILES_CHKSUM = "%s"' % ' \\\n                    '.join(lic_files_chksum))
+    lines_before.append('')
+
+    # FIXME This is kind of a hack, we probably ought to be using bitbake to do this
+    # we'd also want a way to automatically set outfile based upon auto-detecting these values from the source if possible
+    recipefn = os.path.splitext(os.path.basename(outfile))[0]
+    fnsplit = recipefn.split('_')
+    if len(fnsplit) > 1:
+        pn = fnsplit[0]
+        pv = fnsplit[1]
+    else:
+        pn = recipefn
+        pv = None
+
+    if srcuri:
+        if pv and pv not in 'git svn hg'.split():
+            srcuri = srcuri.replace(pv, '${PV}')
+    else:
+        lines_before.append('# No information for SRC_URI yet (only an external source tree was specified)')
+    lines_before.append('SRC_URI = "%s"' % srcuri)
+    (md5value, sha256value) = checksums
+    if md5value:
+        lines_before.append('SRC_URI[md5sum] = "%s"' % md5value)
+    if sha256value:
+        lines_before.append('SRC_URI[sha256sum] = "%s"' % sha256value)
+    if srcuri and supports_srcrev(srcuri):
+        lines_before.append('')
+        lines_before.append('# Modify these as desired')
+        lines_before.append('PV = "1.0+git${SRCPV}"')
+        lines_before.append('SRCREV = "${AUTOREV}"')
+    lines_before.append('')
+
+    if srcsubdir and pv:
+        if srcsubdir == "%s-%s" % (pn, pv):
+            # This would be the default, so we don't need to set S in the recipe
+            srcsubdir = ''
+    if srcsubdir:
+        lines_before.append('S = "${WORKDIR}/%s"' % srcsubdir)
+        lines_before.append('')
+
+    if pkgarch:
+        lines_after.append('PACKAGE_ARCH = "%s"' % pkgarch)
+        lines_after.append('')
+
+    # Find all plugins that want to register handlers
+    handlers = []
+    for plugin in plugins:
+        if hasattr(plugin, 'register_recipe_handlers'):
+            plugin.register_recipe_handlers(handlers)
+
+    # Apply the handlers
+    classes = []
+    handled = []
+    for handler in handlers:
+        handler.process(srctree, classes, lines_before, lines_after, handled)
+
+    outlines = []
+    outlines.extend(lines_before)
+    if classes:
+        outlines.append('inherit %s' % ' '.join(classes))
+        outlines.append('')
+    outlines.extend(lines_after)
+
+    if outfile == '-':
+        sys.stdout.write('\n'.join(outlines) + '\n')
+    else:
+        with open(outfile, 'w') as f:
+            f.write('\n'.join(outlines) + '\n')
+        logger.info('Recipe %s has been created; further editing may be required to make it fully functional' % outfile)
+
+    if tempsrc:
+        shutil.rmtree(tempsrc)
+
+    return 0
+
+def get_license_md5sums(d, static_only=False):
+    import bb.utils
+    md5sums = {}
+    if not static_only:
+        # Gather md5sums of license files in common license dir
+        commonlicdir = d.getVar('COMMON_LICENSE_DIR', True)
+        for fn in os.listdir(commonlicdir):
+            md5value = bb.utils.md5_file(os.path.join(commonlicdir, fn))
+            md5sums[md5value] = fn
+    # The following were extracted from common values in various recipes
+    # (double checking the license against the license file itself, not just
+    # the LICENSE value in the recipe)
+    md5sums['94d55d512a9ba36caa9b7df079bae19f'] = 'GPLv2'
+    md5sums['b234ee4d69f5fce4486a80fdaf4a4263'] = 'GPLv2'
+    md5sums['59530bdf33659b29e73d4adb9f9f6552'] = 'GPLv2'
+    md5sums['0636e73ff0215e8d672dc4c32c317bb3'] = 'GPLv2'
+    md5sums['eb723b61539feef013de476e68b5c50a'] = 'GPLv2'
+    md5sums['751419260aa954499f7abaabaa882bbe'] = 'GPLv2'
+    md5sums['393a5ca445f6965873eca0259a17f833'] = 'GPLv2'
+    md5sums['12f884d2ae1ff87c09e5b7ccc2c4ca7e'] = 'GPLv2'
+    md5sums['8ca43cbc842c2336e835926c2166c28b'] = 'GPLv2'
+    md5sums['ebb5c50ab7cab4baeffba14977030c07'] = 'GPLv2'
+    md5sums['c93c0550bd3173f4504b2cbd8991e50b'] = 'GPLv2'
+    md5sums['9ac2e7cff1ddaf48b6eab6028f23ef88'] = 'GPLv2'
+    md5sums['4325afd396febcb659c36b49533135d4'] = 'GPLv2'
+    md5sums['18810669f13b87348459e611d31ab760'] = 'GPLv2'
+    md5sums['d7810fab7487fb0aad327b76f1be7cd7'] = 'GPLv2' # the Linux kernel's COPYING file
+    md5sums['bbb461211a33b134d42ed5ee802b37ff'] = 'LGPLv2.1'
+    md5sums['7fbc338309ac38fefcd64b04bb903e34'] = 'LGPLv2.1'
+    md5sums['4fbd65380cdd255951079008b364516c'] = 'LGPLv2.1'
+    md5sums['2d5025d4aa3495befef8f17206a5b0a1'] = 'LGPLv2.1'
+    md5sums['fbc093901857fcd118f065f900982c24'] = 'LGPLv2.1'
+    md5sums['a6f89e2100d9b6cdffcea4f398e37343'] = 'LGPLv2.1'
+    md5sums['d8045f3b8f929c1cb29a1e3fd737b499'] = 'LGPLv2.1'
+    md5sums['fad9b3332be894bab9bc501572864b29'] = 'LGPLv2.1'
+    md5sums['3bf50002aefd002f49e7bb854063f7e7'] = 'LGPLv2'
+    md5sums['9f604d8a4f8e74f4f5140845a21b6674'] = 'LGPLv2'
+    md5sums['5f30f0716dfdd0d91eb439ebec522ec2'] = 'LGPLv2'
+    md5sums['55ca817ccb7d5b5b66355690e9abc605'] = 'LGPLv2'
+    md5sums['252890d9eee26aab7b432e8b8a616475'] = 'LGPLv2'
+    md5sums['d32239bcb673463ab874e80d47fae504'] = 'GPLv3'
+    md5sums['f27defe1e96c2e1ecd4e0c9be8967949'] = 'GPLv3'
+    md5sums['6a6a8e020838b23406c81b19c1d46df6'] = 'LGPLv3'
+    md5sums['3b83ef96387f14655fc854ddc3c6bd57'] = 'Apache-2.0'
+    md5sums['385c55653886acac3821999a3ccd17b3'] = 'Artistic-1.0 | GPL-2.0' # some perl modules
+    return md5sums
+
+def guess_license(srctree):
+    import bb
+    md5sums = get_license_md5sums(tinfoil.config_data)
+
+    licenses = []
+    licspecs = ['LICENSE*', 'COPYING*', '*[Ll]icense*', 'LICENCE*', 'LEGAL*', '[Ll]egal*', '*GPL*', 'README.lic*', 'COPYRIGHT*', '[Cc]opyright*']
+    licfiles = []
+    for root, dirs, files in os.walk(srctree):
+        for fn in files:
+            for spec in licspecs:
+                if fnmatch.fnmatch(fn, spec):
+                    licfiles.append(os.path.join(root, fn))
+    for licfile in licfiles:
+        md5value = bb.utils.md5_file(licfile)
+        license = md5sums.get(md5value, 'Unknown')
+        licenses.append((license, os.path.relpath(licfile, srctree), md5value))
+
+    # FIXME should we grab at least one source file with a license header and add that too?
+
+    return licenses
+
+def read_pkgconfig_provides(d):
+    pkgdatadir = d.getVar('PKGDATA_DIR', True)
+    pkgmap = {}
+    for fn in glob.glob(os.path.join(pkgdatadir, 'shlibs2', '*.pclist')):
+        with open(fn, 'r') as f:
+            for line in f:
+                pkgmap[os.path.basename(line.rstrip())] = os.path.splitext(os.path.basename(fn))[0]
+    recipemap = {}
+    for pc, pkg in pkgmap.iteritems():
+        pkgdatafile = os.path.join(pkgdatadir, 'runtime', pkg)
+        if os.path.exists(pkgdatafile):
+            with open(pkgdatafile, 'r') as f:
+                for line in f:
+                    if line.startswith('PN: '):
+                        recipemap[pc] = line.split(':', 1)[1].strip()
+    return recipemap
+
+def convert_pkginfo(pkginfofile):
+    values = {}
+    with open(pkginfofile, 'r') as f:
+        indesc = False
+        for line in f:
+            if indesc:
+                if line.strip():
+                    values['DESCRIPTION'] += ' ' + line.strip()
+                else:
+                    indesc = False
+            else:
+                splitline = line.split(': ', 1)
+                key = splitline[0]
+                value = splitline[1].strip() if len(splitline) > 1 else ''
+                if key == 'LICENSE':
+                    for dep in value.split(','):
+                        dep = dep.split()[0]
+                        mapped = depmap.get(dep, '')
+                        if mapped:
+                            depends.append(mapped)
+                elif key == 'License':
+                    values['LICENSE'] = value
+                elif key == 'Summary':
+                    values['SUMMARY'] = value
+                elif key == 'Description':
+                    values['DESCRIPTION'] = value
+                    indesc = True
+    return values
+
+def convert_debian(debpath):
+    # FIXME extend this mapping - perhaps use distro_alias.inc?
+    depmap = {'libz-dev': 'zlib'}
+
+    values = {}
+    depends = []
+    with open(os.path.join(debpath, 'control')) as f:
+        indesc = False
+        for line in f:
+            if indesc:
+                if line.strip():
+                    if line.startswith(' This package contains'):
+                        indesc = False
+                    else:
+                        values['DESCRIPTION'] += ' ' + line.strip()
+                else:
+                    indesc = False
+            else:
+                splitline = line.split(':', 1)
+                if len(splitline) < 2:
+                    continue
+                key = splitline[0]
+                value = splitline[1].strip()
+                if key == 'Build-Depends':
+                    for dep in value.split(','):
+                        dep = dep.split()[0]
+                        mapped = depmap.get(dep, '')
+                        if mapped:
+                            depends.append(mapped)
+                elif key == 'Section':
+                    values['SECTION'] = value
+                elif key == 'Description':
+                    values['SUMMARY'] = value
+                    indesc = True
+
+    if depends:
+        values['DEPENDS'] = ' '.join(depends)
+
+    return values
+
+
+def register_command(subparsers):
+    parser_create = subparsers.add_parser('create', help='Create a new recipe')
+    parser_create.add_argument('source', help='Path or URL to source')
+    parser_create.add_argument('-o', '--outfile', help='Full path and filename to recipe to add', required=True)
+    parser_create.add_argument('-m', '--machine', help='Make recipe machine-specific as opposed to architecture-specific', action='store_true')
+    parser_create.add_argument('-x', '--externalsrc', help='Assuming source is a URL, fetch it and extract it to the specified directory')
+    parser_create.set_defaults(func=create_recipe)
+
diff --git a/scripts/lib/recipetool/create_buildsys.py b/scripts/lib/recipetool/create_buildsys.py
new file mode 100644
index 0000000..6c9e0ef
--- /dev/null
+++ b/scripts/lib/recipetool/create_buildsys.py
@@ -0,0 +1,319 @@
+# Recipe creation tool - create command build system handlers
+#
+# Copyright (C) 2014 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import re
+import os
+import logging
+from recipetool.create import RecipeHandler, read_pkgconfig_provides
+
+logger = logging.getLogger('recipetool')
+
+tinfoil = None
+
+def tinfoil_init(instance):
+    global tinfoil
+    tinfoil = instance
+
+class CmakeRecipeHandler(RecipeHandler):
+    def process(self, srctree, classes, lines_before, lines_after, handled):
+        if 'buildsystem' in handled:
+            return False
+
+        if RecipeHandler.checkfiles(srctree, ['CMakeLists.txt']):
+            classes.append('cmake')
+            lines_after.append('# Specify any options you want to pass to cmake using EXTRA_OECMAKE:')
+            lines_after.append('EXTRA_OECMAKE = ""')
+            lines_after.append('')
+            handled.append('buildsystem')
+            return True
+        return False
+
+class SconsRecipeHandler(RecipeHandler):
+    def process(self, srctree, classes, lines_before, lines_after, handled):
+        if 'buildsystem' in handled:
+            return False
+
+        if RecipeHandler.checkfiles(srctree, ['SConstruct', 'Sconstruct', 'sconstruct']):
+            classes.append('scons')
+            lines_after.append('# Specify any options you want to pass to scons using EXTRA_OESCONS:')
+            lines_after.append('EXTRA_OESCONS = ""')
+            lines_after.append('')
+            handled.append('buildsystem')
+            return True
+        return False
+
+class QmakeRecipeHandler(RecipeHandler):
+    def process(self, srctree, classes, lines_before, lines_after, handled):
+        if 'buildsystem' in handled:
+            return False
+
+        if RecipeHandler.checkfiles(srctree, ['*.pro']):
+            classes.append('qmake2')
+            handled.append('buildsystem')
+            return True
+        return False
+
+class AutotoolsRecipeHandler(RecipeHandler):
+    def process(self, srctree, classes, lines_before, lines_after, handled):
+        if 'buildsystem' in handled:
+            return False
+
+        autoconf = False
+        if RecipeHandler.checkfiles(srctree, ['configure.ac', 'configure.in']):
+            autoconf = True
+            values = AutotoolsRecipeHandler.extract_autotools_deps(lines_before, srctree)
+            classes.extend(values.pop('inherit', '').split())
+            for var, value in values.iteritems():
+                lines_before.append('%s = "%s"' % (var, value))
+        else:
+            conffile = RecipeHandler.checkfiles(srctree, ['configure'])
+            if conffile:
+                # Check if this is just a pre-generated autoconf configure script
+                with open(conffile[0], 'r') as f:
+                    for i in range(1, 10):
+                        if 'Generated by GNU Autoconf' in f.readline():
+                            autoconf = True
+                            break
+
+        if autoconf:
+            lines_before.append('# NOTE: if this software is not capable of being built in a separate build directory')
+            lines_before.append('# from the source, you should replace autotools with autotools-brokensep in the')
+            lines_before.append('# inherit line')
+            classes.append('autotools')
+            lines_after.append('# Specify any options you want to pass to the configure script using EXTRA_OECONF:')
+            lines_after.append('EXTRA_OECONF = ""')
+            lines_after.append('')
+            handled.append('buildsystem')
+            return True
+
+        return False
+
+    @staticmethod
+    def extract_autotools_deps(outlines, srctree, acfile=None):
+        import shlex
+        import oe.package
+
+        values = {}
+        inherits = []
+
+        # FIXME this mapping is very thin
+        progmap = {'flex': 'flex-native',
+                'bison': 'bison-native',
+                'm4': 'm4-native'}
+        progclassmap = {'gconftool-2': 'gconf',
+                'pkg-config': 'pkgconfig'}
+
+        ignoredeps = ['gcc-runtime', 'glibc', 'uclibc']
+
+        pkg_re = re.compile('PKG_CHECK_MODULES\(\[?[a-zA-Z0-9]*\]?, \[?([^,\]]*)[),].*')
+        lib_re = re.compile('AC_CHECK_LIB\(\[?([a-zA-Z0-9]*)\]?, .*')
+        progs_re = re.compile('_PROGS?\(\[?[a-zA-Z0-9]*\]?, \[?([^,\]]*)\]?[),].*')
+        dep_re = re.compile('([^ ><=]+)( [<>=]+ [^ ><=]+)?')
+
+        # Build up lib library->package mapping
+        shlib_providers = oe.package.read_shlib_providers(tinfoil.config_data)
+        libdir = tinfoil.config_data.getVar('libdir', True)
+        base_libdir = tinfoil.config_data.getVar('base_libdir', True)
+        libpaths = list(set([base_libdir, libdir]))
+        libname_re = re.compile('^lib(.+)\.so.*$')
+        pkglibmap = {}
+        for lib, item in shlib_providers.iteritems():
+            for path, pkg in item.iteritems():
+                if path in libpaths:
+                    res = libname_re.match(lib)
+                    if res:
+                        libname = res.group(1)
+                        if not libname in pkglibmap:
+                            pkglibmap[libname] = pkg[0]
+                    else:
+                        logger.debug('unable to extract library name from %s' % lib)
+
+        # Now turn it into a library->recipe mapping
+        recipelibmap = {}
+        pkgdata_dir = tinfoil.config_data.getVar('PKGDATA_DIR', True)
+        for libname, pkg in pkglibmap.iteritems():
+            try:
+                with open(os.path.join(pkgdata_dir, 'runtime', pkg)) as f:
+                    for line in f:
+                        if line.startswith('PN:'):
+                            recipelibmap[libname] = line.split(':', 1)[-1].strip()
+                            break
+            except IOError as ioe:
+                if ioe.errno == 2:
+                    logger.warn('unable to find a pkgdata file for package %s' % pkg)
+                else:
+                    raise
+
+        # Since a configure.ac file is essentially a program, this is only ever going to be
+        # a hack unfortunately; but it ought to be enough of an approximation
+        if acfile:
+            srcfiles = [acfile]
+        else:
+            srcfiles = RecipeHandler.checkfiles(srctree, ['configure.ac', 'configure.in'])
+        pcdeps = []
+        deps = []
+        unmapped = []
+        unmappedlibs = []
+        with open(srcfiles[0], 'r') as f:
+            for line in f:
+                if 'PKG_CHECK_MODULES' in line:
+                    res = pkg_re.search(line)
+                    if res:
+                        res = dep_re.findall(res.group(1))
+                        if res:
+                            pcdeps.extend([x[0] for x in res])
+                    inherits.append('pkgconfig')
+                if line.lstrip().startswith('AM_GNU_GETTEXT'):
+                    inherits.append('gettext')
+                elif 'AC_CHECK_PROG' in line or 'AC_PATH_PROG' in line:
+                    res = progs_re.search(line)
+                    if res:
+                        for prog in shlex.split(res.group(1)):
+                            prog = prog.split()[0]
+                            progclass = progclassmap.get(prog, None)
+                            if progclass:
+                                inherits.append(progclass)
+                            else:
+                                progdep = progmap.get(prog, None)
+                                if progdep:
+                                    deps.append(progdep)
+                                else:
+                                    if not prog.startswith('$'):
+                                        unmapped.append(prog)
+                elif 'AC_CHECK_LIB' in line:
+                    res = lib_re.search(line)
+                    if res:
+                        lib = res.group(1)
+                        libdep = recipelibmap.get(lib, None)
+                        if libdep:
+                            deps.append(libdep)
+                        elif libdep is None and not lib.startswith('$'):
+                            unmappedlibs.append(lib)
+                elif 'AC_PATH_X' in line:
+                    deps.append('libx11')
+
+        if unmapped:
+            outlines.append('# NOTE: the following prog dependencies are unknown, ignoring: %s' % ' '.join(unmapped))
+
+        if unmappedlibs:
+            outlines.append('# NOTE: the following library dependencies are unknown, ignoring: %s' % ' '.join(unmappedlibs))
+            outlines.append('#       (this is based on recipes that have previously been built and packaged)')
+
+        recipemap = read_pkgconfig_provides(tinfoil.config_data)
+        unmapped = []
+        for pcdep in pcdeps:
+            recipe = recipemap.get(pcdep, None)
+            if recipe:
+                deps.append(recipe)
+            else:
+                if not pcdep.startswith('$'):
+                    unmapped.append(pcdep)
+
+        deps = set(deps).difference(set(ignoredeps))
+
+        if unmapped:
+            outlines.append('# NOTE: unable to map the following pkg-config dependencies: %s' % ' '.join(unmapped))
+            outlines.append('#       (this is based on recipes that have previously been built and packaged)')
+
+        if deps:
+            values['DEPENDS'] = ' '.join(deps)
+
+        if inherits:
+            values['inherit'] = ' '.join(list(set(inherits)))
+
+        return values
+
+
+class MakefileRecipeHandler(RecipeHandler):
+    def process(self, srctree, classes, lines_before, lines_after, handled):
+        if 'buildsystem' in handled:
+            return False
+
+        makefile = RecipeHandler.checkfiles(srctree, ['Makefile'])
+        if makefile:
+            lines_after.append('# NOTE: this is a Makefile-only piece of software, so we cannot generate much of the')
+            lines_after.append('# recipe automatically - you will need to examine the Makefile yourself and ensure')
+            lines_after.append('# that the appropriate arguments are passed in.')
+            lines_after.append('')
+
+            scanfile = os.path.join(srctree, 'configure.scan')
+            skipscan = False
+            try:
+                stdout, stderr = bb.process.run('autoscan', cwd=srctree, shell=True)
+            except bb.process.ExecutionError as e:
+                skipscan = True
+            if not skipscan and os.path.exists(scanfile):
+                values = AutotoolsRecipeHandler.extract_autotools_deps(lines_before, srctree, acfile=scanfile)
+                classes.extend(values.pop('inherit', '').split())
+                for var, value in values.iteritems():
+                    if var == 'DEPENDS':
+                        lines_before.append('# NOTE: some of these dependencies may be optional, check the Makefile and/or upstream documentation')
+                    lines_before.append('%s = "%s"' % (var, value))
+                lines_before.append('')
+                for f in ['configure.scan', 'autoscan.log']:
+                    fp = os.path.join(srctree, f)
+                    if os.path.exists(fp):
+                        os.remove(fp)
+
+            self.genfunction(lines_after, 'do_configure', ['# Specify any needed configure commands here'])
+
+            func = []
+            func.append('# You will almost certainly need to add additional arguments here')
+            func.append('oe_runmake')
+            self.genfunction(lines_after, 'do_compile', func)
+
+            installtarget = True
+            try:
+                stdout, stderr = bb.process.run('make -qn install', cwd=srctree, shell=True)
+            except bb.process.ExecutionError as e:
+                if e.exitcode != 1:
+                    installtarget = False
+            func = []
+            if installtarget:
+                func.append('# This is a guess; additional arguments may be required')
+                makeargs = ''
+                with open(makefile[0], 'r') as f:
+                    for i in range(1, 100):
+                        if 'DESTDIR' in f.readline():
+                            makeargs += " 'DESTDIR=${D}'"
+                            break
+                func.append('oe_runmake install%s' % makeargs)
+            else:
+                func.append('# NOTE: unable to determine what to put here - there is a Makefile but no')
+                func.append('# target named "install", so you will need to define this yourself')
+            self.genfunction(lines_after, 'do_install', func)
+
+            handled.append('buildsystem')
+        else:
+            lines_after.append('# NOTE: no Makefile found, unable to determine what needs to be done')
+            lines_after.append('')
+            self.genfunction(lines_after, 'do_configure', ['# Specify any needed configure commands here'])
+            self.genfunction(lines_after, 'do_compile', ['# Specify compilation commands here'])
+            self.genfunction(lines_after, 'do_install', ['# Specify install commands here'])
+
+
+def plugin_init(pluginlist):
+    pass
+
+def register_recipe_handlers(handlers):
+    # These are in a specific order so that the right one is detected first
+    handlers.append(CmakeRecipeHandler())
+    handlers.append(AutotoolsRecipeHandler())
+    handlers.append(SconsRecipeHandler())
+    handlers.append(QmakeRecipeHandler())
+    handlers.append(MakefileRecipeHandler())
diff --git a/scripts/recipetool b/scripts/recipetool
new file mode 100755
index 0000000..70e6b6c
--- /dev/null
+++ b/scripts/recipetool
@@ -0,0 +1,99 @@
+#!/usr/bin/env python
+
+# Recipe creation tool
+#
+# Copyright (C) 2014 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import sys
+import os
+import argparse
+import glob
+import logging
+
+scripts_path = os.path.dirname(os.path.realpath(__file__))
+lib_path = scripts_path + '/lib'
+sys.path = sys.path + [lib_path]
+import scriptutils
+logger = scriptutils.logger_create('recipetool')
+
+plugins = []
+
+def tinfoil_init():
+    import bb.tinfoil
+    import logging
+    tinfoil = bb.tinfoil.Tinfoil()
+    tinfoil.prepare(True)
+
+    for plugin in plugins:
+        if hasattr(plugin, 'tinfoil_init'):
+            plugin.tinfoil_init(tinfoil)
+    tinfoil.logger.setLevel(logging.WARNING)
+
+def main():
+
+    if not os.environ.get('BUILDDIR', ''):
+        logger.error("This script can only be run after initialising the build environment (e.g. by using oe-init-build-env)")
+        sys.exit(1)
+
+    parser = argparse.ArgumentParser(description="OpenEmbedded recipe tool",
+                                     epilog="Use %(prog)s <command> --help to get help on a specific command")
+    parser.add_argument('-d', '--debug', help='Enable debug output', action='store_true')
+    parser.add_argument('-q', '--quiet', help='Print only errors', action='store_true')
+    parser.add_argument('--color', help='Colorize output', choices=['auto', 'always', 'never'], default='auto')
+    subparsers = parser.add_subparsers()
+
+    scriptutils.load_plugins(logger, plugins, os.path.join(scripts_path, 'lib', 'recipetool'))
+    registered = False
+    for plugin in plugins:
+        if hasattr(plugin, 'register_command'):
+            registered = True
+            plugin.register_command(subparsers)
+
+    if not registered:
+        logger.error("No commands registered - missing plugins?")
+        sys.exit(1)
+
+    args = parser.parse_args()
+
+    if args.debug:
+        logger.setLevel(logging.DEBUG)
+    elif args.quiet:
+        logger.setLevel(logging.ERROR)
+
+    import scriptpath
+    bitbakepath = scriptpath.add_bitbake_lib_path()
+    if not bitbakepath:
+        logger.error("Unable to find bitbake by searching parent directory of this script or PATH")
+        sys.exit(1)
+    logger.debug('Found bitbake path: %s' % bitbakepath)
+
+    scriptutils.logger_setup_color(logger, args.color)
+
+    tinfoil_init()
+
+    ret = args.func(args)
+
+    return ret
+
+
+if __name__ == "__main__":
+    try:
+        ret = main()
+    except Exception:
+        ret = 1
+        import traceback
+        traceback.print_exc(5)
+    sys.exit(ret)
-- 
1.9.3



^ permalink raw reply related	[flat|nested] 17+ messages in thread

* [PATCH 12/15] scripts: add scriptutils module
  2014-12-19 11:41 [PATCH 00/15] Developer workflow tools Paul Eggleton
                   ` (10 preceding siblings ...)
  2014-12-19 11:41 ` [PATCH 11/15] scripts/recipetool: Add a recipe auto-creation script Paul Eggleton
@ 2014-12-19 11:41 ` Paul Eggleton
  2014-12-19 11:41 ` [PATCH 13/15] scripts/devtool: add development helper tool Paul Eggleton
                   ` (3 subsequent siblings)
  15 siblings, 0 replies; 17+ messages in thread
From: Paul Eggleton @ 2014-12-19 11:41 UTC (permalink / raw)
  To: openembedded-core

Add a utility module for scripts. This is intended to provide functions
only really useful before bitbake has been found (or only of particular
interest to scripts). At the moment this includes functions for setting
up a logger and for loading plugins.
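The plugin-loading half can be sketched standalone as follows. This is an illustration only: it mirrors the function and hook names in the patch (load_plugins, plugin_init), but uses importlib where the 2014 code uses the imp module, since imp has since been removed from Python.

```python
# Sketch of the scriptutils.load_plugins pattern: import every .py file
# in a directory, call its plugin_init() hook if present, then collect it.
# (Standalone illustration; the real module uses imp and also logs each load.)
import glob
import importlib.util
import os

def load_plugins(plugins, pluginpath):
    for fn in sorted(glob.glob(os.path.join(pluginpath, '*.py'))):
        name = os.path.splitext(os.path.basename(fn))[0]
        if name == '__init__':
            continue
        spec = importlib.util.spec_from_file_location(name, fn)
        plugin = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(plugin)
        if hasattr(plugin, 'plugin_init'):
            plugin.plugin_init(plugins)  # hook sees plugins loaded so far
            plugins.append(plugin)
```

Both recipetool and devtool below share this mechanism: each drops its subcommands into scripts/lib/<toolname>/ and the driver script picks them up at startup.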

Signed-off-by: Paul Eggleton <paul.eggleton@linux.intel.com>
---
 scripts/lib/scriptutils.py | 60 ++++++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 60 insertions(+)
 create mode 100644 scripts/lib/scriptutils.py

diff --git a/scripts/lib/scriptutils.py b/scripts/lib/scriptutils.py
new file mode 100644
index 0000000..e786126
--- /dev/null
+++ b/scripts/lib/scriptutils.py
@@ -0,0 +1,60 @@
+# Script utility functions
+#
+# Copyright (C) 2014 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import sys
+import os
+import logging
+import glob
+
+def logger_create(name):
+    logger = logging.getLogger(name)
+    loggerhandler = logging.StreamHandler()
+    loggerhandler.setFormatter(logging.Formatter("%(levelname)s: %(message)s"))
+    logger.addHandler(loggerhandler)
+    logger.setLevel(logging.INFO)
+    return logger
+
+def logger_setup_color(logger, color='auto'):
+    from bb.msg import BBLogFormatter
+    console = logging.StreamHandler(sys.stdout)
+    formatter = BBLogFormatter("%(levelname)s: %(message)s")
+    console.setFormatter(formatter)
+    logger.handlers = [console]
+    if color == 'always' or (color == 'auto' and console.stream.isatty()):
+        formatter.enable_color()
+
+
+def load_plugins(logger, plugins, pluginpath):
+    import imp
+
+    def load_plugin(name):
+        logger.debug('Loading plugin %s' % name)
+        fp, pathname, description = imp.find_module(name, [pluginpath])
+        try:
+            return imp.load_module(name, fp, pathname, description)
+        finally:
+            if fp:
+                fp.close()
+
+    logger.debug('Loading plugins from %s...' % pluginpath)
+    for fn in glob.glob(os.path.join(pluginpath, '*.py')):
+        name = os.path.splitext(os.path.basename(fn))[0]
+        if name != '__init__':
+            plugin = load_plugin(name)
+            if hasattr(plugin, 'plugin_init'):
+                plugin.plugin_init(plugins)
+                plugins.append(plugin)
-- 
1.9.3




* [PATCH 13/15] scripts/devtool: add development helper tool
  2014-12-19 11:41 [PATCH 00/15] Developer workflow tools Paul Eggleton
                   ` (11 preceding siblings ...)
  2014-12-19 11:41 ` [PATCH 12/15] scripts: add scriptutils module Paul Eggleton
@ 2014-12-19 11:41 ` Paul Eggleton
  2014-12-19 11:41 ` [PATCH 14/15] scripts/devtool: Support deploy/undeploy function Paul Eggleton
                   ` (2 subsequent siblings)
  15 siblings, 0 replies; 17+ messages in thread
From: Paul Eggleton @ 2014-12-19 11:41 UTC (permalink / raw)
  To: openembedded-core

Provides an easy means to work on developing applications and system
components with the build system.

For example to "modify" the source for an existing recipe:

  $ devtool modify -x pango /home/projects/pango
  Parsing recipes..done.
  NOTE: Fetching pango...
  NOTE: Unpacking...
  NOTE: Patching...
  NOTE: Source tree extracted to /home/projects/pango
  NOTE: Recipe pango now set up to build from /home/projects/pango

The pango source is now extracted to /home/projects/pango, managed
in git, with each patch as a commit, and a bbappend is created in the
workspace layer to use the source in /home/projects/pango when
building.
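
For illustration, the workspace bbappend that points the build at the external tree looks roughly like this (a hypothetical sketch based on the EXTERNALSRC handling in the code below; the exact contents are generated by devtool and may differ):

```
# workspace/appends/pango_<version>.bbappend (hypothetical example)
inherit externalsrc
EXTERNALSRC_pn-pango = "/home/projects/pango"
```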

Additionally, you can add a new piece of software:

  $ devtool add pv /home/projects/pv
  NOTE: Recipe /path/to/workspace/recipes/pv/pv.bb has been
  automatically created; further editing may be required to make it
  fully functional

The latter uses recipetool to create a skeleton recipe and again sets up
a bbappend to use the source in /home/projects/pv when building.

Having done a "devtool modify", you can also write any changes made in
the external git repository back as patches next to the recipe:

  $ devtool update-recipe mdadm
  Parsing recipes..done.
  NOTE: Removing patch mdadm-3.2.2_fix_for_x32.patch
  NOTE: Removing patch gcc-4.9.patch
  NOTE: Updating recipe mdadm_3.3.1.bb

[YOCTO #6561]
[YOCTO #6653]
[YOCTO #6656]

Signed-off-by: Paul Eggleton <paul.eggleton@linux.intel.com>
---
 scripts/devtool                 | 255 +++++++++++++++++++
 scripts/lib/devtool/__init__.py |  78 ++++++
 scripts/lib/devtool/standard.py | 545 ++++++++++++++++++++++++++++++++++++++++
 3 files changed, 878 insertions(+)
 create mode 100755 scripts/devtool
 create mode 100644 scripts/lib/devtool/__init__.py
 create mode 100644 scripts/lib/devtool/standard.py

diff --git a/scripts/devtool b/scripts/devtool
new file mode 100755
index 0000000..d6e1b97
--- /dev/null
+++ b/scripts/devtool
@@ -0,0 +1,255 @@
+#!/usr/bin/env python
+
+# OpenEmbedded Development tool
+#
+# Copyright (C) 2014 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import sys
+import os
+import argparse
+import glob
+import re
+import ConfigParser
+import subprocess
+import logging
+
+basepath = ''
+workspace = {}
+config = None
+context = None
+
+
+scripts_path = os.path.dirname(os.path.realpath(__file__))
+lib_path = scripts_path + '/lib'
+sys.path = sys.path + [lib_path]
+import scriptutils
+logger = scriptutils.logger_create('devtool')
+
+plugins = []
+
+
+class ConfigHandler(object):
+    config_file = ''
+    config_obj = None
+    init_path = ''
+    workspace_path = ''
+
+    def __init__(self, filename):
+        self.config_file = filename
+        self.config_obj = ConfigParser.SafeConfigParser()
+
+    def get(self, section, option, default=None):
+        try:
+            ret = self.config_obj.get(section, option)
+        except (ConfigParser.NoOptionError, ConfigParser.NoSectionError):
+            if default is not None:
+                ret = default
+            else:
+                raise
+        return ret
+
+    def read(self):
+        if os.path.exists(self.config_file):
+            self.config_obj.read(self.config_file)
+
+            if self.config_obj.has_option('General', 'init_path'):
+                pth = self.get('General', 'init_path')
+                self.init_path = os.path.join(basepath, pth)
+                if not os.path.exists(self.init_path):
+                    logger.error('init_path %s specified in config file cannot be found' % pth)
+                    return False
+        else:
+            self.config_obj.add_section('General')
+
+        self.workspace_path = self.get('General', 'workspace_path', os.path.join(basepath, 'workspace'))
+        return True
+
+
+    def write(self):
+        logger.debug('writing to config file %s' % self.config_file)
+        self.config_obj.set('General', 'workspace_path', self.workspace_path)
+        with open(self.config_file, 'w') as f:
+            self.config_obj.write(f)
+
+class Context:
+    def __init__(self, **kwargs):
+        self.__dict__.update(kwargs)
+
+
+def read_workspace():
+    global workspace
+    workspace = {}
+    if not os.path.exists(os.path.join(config.workspace_path, 'conf', 'layer.conf')):
+        if context.fixed_setup:
+            logger.error("workspace layer not set up")
+            sys.exit(1)
+        else:
+            logger.info('Creating workspace layer in %s' % config.workspace_path)
+            _create_workspace(config.workspace_path, config, basepath)
+
+    logger.debug('Reading workspace in %s' % config.workspace_path)
+    externalsrc_re = re.compile(r'^EXTERNALSRC(_pn-[a-zA-Z0-9-]*)? =.*$')
+    for fn in glob.glob(os.path.join(config.workspace_path, 'appends', '*.bbappend')):
+        pn = os.path.splitext(os.path.basename(fn))[0].split('_')[0]
+        with open(fn, 'r') as f:
+            for line in f:
+                if externalsrc_re.match(line.rstrip()):
+                    splitval = line.split('=', 2)
+                    workspace[pn] = splitval[1].strip('" \n\r\t')
+                    break
+
+def create_workspace(args, config, basepath, workspace):
+    if args.directory:
+        workspacedir = os.path.abspath(args.directory)
+    else:
+        workspacedir = os.path.abspath(os.path.join(basepath, 'workspace'))
+    _create_workspace(workspacedir, config, basepath, args.create_only)
+
+def _create_workspace(workspacedir, config, basepath, create_only=False):
+    import bb
+
+    confdir = os.path.join(workspacedir, 'conf')
+    if os.path.exists(os.path.join(confdir, 'layer.conf')):
+        logger.info('Specified workspace already set up, leaving as-is')
+    else:
+        # Add a config file
+        bb.utils.mkdirhier(confdir)
+        with open(os.path.join(confdir, 'layer.conf'), 'w') as f:
+            f.write('# ### workspace layer auto-generated by devtool ###\n')
+            f.write('BBPATH =. "$' + '{LAYERDIR}:"\n')
+            f.write('BBFILES += "$' + '{LAYERDIR}/recipes/*/*.bb \\\n')
+            f.write('            $' + '{LAYERDIR}/appends/*.bbappend"\n')
+            f.write('BBFILE_COLLECTIONS += "workspacelayer"\n')
+            f.write('BBFILE_PATTERN_workspacelayer = "^$' + '{LAYERDIR}/"\n')
+            f.write('BBFILE_PATTERN_IGNORE_EMPTY_workspacelayer = "1"\n')
+            f.write('BBFILE_PRIORITY_workspacelayer = "99"\n')
+        # Add a README file
+        with open(os.path.join(workspacedir, 'README'), 'w') as f:
+            f.write('This layer was created by the OpenEmbedded devtool utility in order to\n')
+            f.write('contain recipes and bbappends. In most instances you should use the\n')
+            f.write('devtool utility to manage files within it rather than modifying files\n')
+            f.write('directly (although recipes added with "devtool add" will often need\n')
+            f.write('direct modification.)\n')
+            f.write('\nIf you no longer need to use devtool you can remove the path to this\n')
+            f.write('workspace layer from your conf/bblayers.conf file (and then delete the\n')
+            f.write('layer, if you wish).\n')
+    if not create_only:
+        # Add the workspace layer to bblayers.conf
+        bblayers_conf = os.path.join(basepath, 'conf', 'bblayers.conf')
+        if not os.path.exists(bblayers_conf):
+            logger.error('Unable to find bblayers.conf')
+            return -1
+        bb.utils.edit_bblayers_conf(bblayers_conf, workspacedir, config.workspace_path)
+        if config.workspace_path != workspacedir:
+            # Update our config to point to the new location
+            config.workspace_path = workspacedir
+            config.write()
+
+
+def main():
+    global basepath
+    global config
+    global context
+
+    context = Context(fixed_setup=False)
+
+    # Default basepath
+    basepath = os.path.dirname(os.path.abspath(__file__))
+    pth = basepath
+    while pth != '' and pth != os.sep:
+        if os.path.exists(os.path.join(pth, '.devtoolbase')):
+            context.fixed_setup = True
+            basepath = pth
+            break
+        pth = os.path.dirname(pth)
+
+    parser = argparse.ArgumentParser(description="OpenEmbedded development tool",
+                                     epilog="Use %(prog)s <command> --help to get help on a specific command")
+    parser.add_argument('--basepath', help='Base directory of SDK / build directory')
+    parser.add_argument('-d', '--debug', help='Enable debug output', action='store_true')
+    parser.add_argument('-q', '--quiet', help='Print only errors', action='store_true')
+    parser.add_argument('--color', help='Colorize output', choices=['auto', 'always', 'never'], default='auto')
+
+    subparsers = parser.add_subparsers(dest="subparser_name")
+
+    if not context.fixed_setup:
+        parser_create_workspace = subparsers.add_parser('create-workspace', help='Set up a workspace')
+        parser_create_workspace.add_argument('directory', nargs='?', help='Directory for the workspace')
+        parser_create_workspace.add_argument('--create-only', action="store_true", help='Only create the workspace, do not alter configuration')
+        parser_create_workspace.set_defaults(func=create_workspace)
+
+    scriptutils.load_plugins(logger, plugins, os.path.join(scripts_path, 'lib', 'devtool'))
+    for plugin in plugins:
+        if hasattr(plugin, 'register_commands'):
+            plugin.register_commands(subparsers, context)
+
+    args = parser.parse_args()
+
+    if args.debug:
+        logger.setLevel(logging.DEBUG)
+    elif args.quiet:
+        logger.setLevel(logging.ERROR)
+
+    if args.basepath:
+        # Override
+        basepath = args.basepath
+    elif not context.fixed_setup:
+        basepath = os.environ.get('BUILDDIR')
+        if not basepath:
+            logger.error("This script can only be run after initialising the build environment (e.g. by using oe-init-build-env)")
+            sys.exit(1)
+
+    logger.debug('Using basepath %s' % basepath)
+
+    config = ConfigHandler(os.path.join(basepath, 'conf', 'devtool.conf'))
+    if not config.read():
+        return -1
+
+    bitbake_subdir = config.get('General', 'bitbake_subdir', '')
+    if bitbake_subdir:
+        # Normally set for use within the SDK
+        logger.debug('Using bitbake subdir %s' % bitbake_subdir)
+        sys.path.insert(0, os.path.join(basepath, bitbake_subdir, 'lib'))
+        core_meta_subdir = config.get('General', 'core_meta_subdir')
+        sys.path.insert(0, os.path.join(basepath, core_meta_subdir, 'lib'))
+    else:
+        # Standard location
+        import scriptpath
+        bitbakepath = scriptpath.add_bitbake_lib_path()
+        if not bitbakepath:
+            logger.error("Unable to find bitbake by searching parent directory of this script or PATH")
+            sys.exit(1)
+        logger.debug('Using standard bitbake path %s' % bitbakepath)
+        scriptpath.add_oe_lib_path()
+
+    scriptutils.logger_setup_color(logger, args.color)
+
+    if args.subparser_name != 'create-workspace':
+        read_workspace()
+
+    ret = args.func(args, config, basepath, workspace)
+
+    return ret
+
+
+if __name__ == "__main__":
+    try:
+        ret = main()
+    except Exception:
+        ret = 1
+        import traceback
+        traceback.print_exc(5)
+    sys.exit(ret)
diff --git a/scripts/lib/devtool/__init__.py b/scripts/lib/devtool/__init__.py
new file mode 100644
index 0000000..3f8158e
--- /dev/null
+++ b/scripts/lib/devtool/__init__.py
@@ -0,0 +1,78 @@
+#!/usr/bin/env python
+
+# Development tool - utility functions for plugins
+#
+# Copyright (C) 2014 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+
+import os
+import sys
+import subprocess
+import logging
+
+logger = logging.getLogger('devtool')
+
+def exec_build_env_command(init_path, builddir, cmd, watch=False, **options):
+    import bb
+    if 'cwd' not in options:
+        options['cwd'] = builddir
+    if init_path:
+        logger.debug('Executing command: "%s" using init path %s' % (cmd, init_path))
+        init_prefix = '. %s %s > /dev/null && ' % (init_path, builddir)
+    else:
+        logger.debug('Executing command "%s"' % cmd)
+        init_prefix = ''
+    if watch:
+        if sys.stdout.isatty():
+            # Fool bitbake into thinking it's outputting to a terminal (because it is, indirectly)
+            cmd = 'script -q -c "%s" /dev/null' % cmd
+        return exec_watch('%s%s' % (init_prefix, cmd), **options)
+    else:
+        return bb.process.run('%s%s' % (init_prefix, cmd), **options)
+
+def exec_watch(cmd, **options):
+    if isinstance(cmd, basestring) and 'shell' not in options:
+        options['shell'] = True
+
+    process = subprocess.Popen(
+        cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, **options
+    )
+
+    buf = ''
+    while True:
+        out = process.stdout.read(1)
+        if out:
+            sys.stdout.write(out)
+            sys.stdout.flush()
+            buf += out
+        elif out == '' and process.poll() is not None:
+            break
+    return buf
+
+def setup_tinfoil():
+    import scriptpath
+    bitbakepath = scriptpath.add_bitbake_lib_path()
+    if not bitbakepath:
+        logger.error("Unable to find bitbake by searching parent directory of this script or PATH")
+        sys.exit(1)
+
+    import bb.tinfoil
+    import logging
+    tinfoil = bb.tinfoil.Tinfoil()
+    tinfoil.prepare(False)
+    tinfoil.logger.setLevel(logging.WARNING)
+    return tinfoil
+
diff --git a/scripts/lib/devtool/standard.py b/scripts/lib/devtool/standard.py
new file mode 100644
index 0000000..69bb228
--- /dev/null
+++ b/scripts/lib/devtool/standard.py
@@ -0,0 +1,545 @@
+# Development tool - standard commands plugin
+#
+# Copyright (C) 2014 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import os
+import sys
+import re
+import shutil
+import glob
+import tempfile
+import logging
+import argparse
+from devtool import exec_build_env_command, setup_tinfoil
+
+logger = logging.getLogger('devtool')
+
+def plugin_init(pluginlist):
+    pass
+
+
+def add(args, config, basepath, workspace):
+    import bb
+    import oe.recipeutils
+
+    if args.recipename in workspace:
+        logger.error("recipe %s is already in your workspace" % args.recipename)
+        return -1
+
+    reason = oe.recipeutils.validate_pn(args.recipename)
+    if reason:
+        logger.error(reason)
+        return -1
+
+    srctree = os.path.abspath(args.srctree)
+    appendpath = os.path.join(config.workspace_path, 'appends')
+    if not os.path.exists(appendpath):
+        os.makedirs(appendpath)
+
+    recipedir = os.path.join(config.workspace_path, 'recipes', args.recipename)
+    bb.utils.mkdirhier(recipedir)
+    if args.version:
+        if '_' in args.version or ' ' in args.version:
+            logger.error('Invalid version string "%s"' % args.version)
+            return -1
+        bp = "%s_%s" % (args.recipename, args.version)
+    else:
+        bp = args.recipename
+    recipefile = os.path.join(recipedir, "%s.bb" % bp)
+    if sys.stdout.isatty():
+        color = 'always'
+    else:
+        color = args.color
+    stdout, stderr = exec_build_env_command(config.init_path, basepath, 'recipetool --color=%s create -o %s %s' % (color, recipefile, srctree))
+    logger.info('Recipe %s has been automatically created; further editing may be required to make it fully functional' % recipefile)
+
+    _add_md5(config, args.recipename, recipefile)
+
+    initial_rev = None
+    if os.path.exists(os.path.join(srctree, '.git')):
+        (stdout, _) = bb.process.run('git rev-parse HEAD', cwd=srctree)
+        initial_rev = stdout.rstrip()
+
+    appendfile = os.path.join(appendpath, '%s.bbappend' % args.recipename)
+    with open(appendfile, 'w') as f:
+        f.write('inherit externalsrc\n')
+        f.write('EXTERNALSRC = "%s"\n' % srctree)
+        if initial_rev:
+            f.write('\n# initial_rev: %s\n' % initial_rev)
+
+    _add_md5(config, args.recipename, appendfile)
+
+    return 0
+
+
+def _get_recipe_file(cooker, pn):
+    import oe.recipeutils
+    recipefile = oe.recipeutils.pn_to_recipe(cooker, pn)
+    if not recipefile:
+        skipreasons = oe.recipeutils.get_unavailable_reasons(cooker, pn)
+        if skipreasons:
+            logger.error('\n'.join(skipreasons))
+        else:
+            logger.error("Unable to find any recipe file matching %s" % pn)
+    return recipefile
+
+
+def extract(args, config, basepath, workspace):
+    import bb
+    import oe.recipeutils
+
+    tinfoil = setup_tinfoil()
+
+    recipefile = _get_recipe_file(tinfoil.cooker, args.recipename)
+    if not recipefile:
+        # Error already logged
+        return -1
+    rd = oe.recipeutils.parse_recipe(recipefile, tinfoil.config_data)
+
+    srctree = os.path.abspath(args.srctree)
+    initial_rev = _extract_source(srctree, args.keep_temp, args.branch, rd)
+    if initial_rev:
+        return 0
+    else:
+        return -1
+
+
+def _extract_source(srctree, keep_temp, devbranch, d):
+    import bb.event
+
+    def eventfilter(name, handler, event, d):
+        if name == 'base_eventhandler':
+            return True
+        else:
+            return False
+
+    if hasattr(bb.event, 'set_eventfilter'):
+        bb.event.set_eventfilter(eventfilter)
+
+    pn = d.getVar('PN', True)
+
+    if pn == 'perf':
+        logger.error("The perf recipe does not actually check out source and thus cannot be supported by this tool")
+        return None
+
+    if 'work-shared' in d.getVar('S', True):
+        logger.error("The %s recipe uses a shared workdir which this tool does not currently support" % pn)
+        return None
+
+    if bb.data.inherits_class('externalsrc', d) and d.getVar('EXTERNALSRC', True):
+        logger.error("externalsrc is currently enabled for the %s recipe. This prevents the normal do_patch task from working. You will need to disable this first." % pn)
+        return None
+
+    if os.path.exists(srctree):
+        if not os.path.isdir(srctree):
+            logger.error("output path %s exists and is not a directory" % srctree)
+            return None
+        elif os.listdir(srctree):
+            logger.error("output path %s already exists and is non-empty" % srctree)
+            return None
+
+    # Prepare for shutil.move later on
+    bb.utils.mkdirhier(srctree)
+    os.rmdir(srctree)
+
+    initial_rev = None
+    tempdir = tempfile.mkdtemp(prefix='devtool')
+    try:
+        crd = d.createCopy()
+        # Make a subdir so we guard against WORKDIR==S
+        workdir = os.path.join(tempdir, 'workdir')
+        crd.setVar('WORKDIR', workdir)
+        crd.setVar('T', os.path.join(tempdir, 'temp'))
+
+        # FIXME: This is very awkward. Unfortunately it's not currently easy to properly
+        # execute tasks outside of bitbake itself, until then this has to suffice if we
+        # are to handle e.g. linux-yocto's extra tasks
+        executed = []
+        def exec_task_func(func, report):
+            if func not in executed:
+                deps = crd.getVarFlag(func, 'deps')
+                if deps:
+                    for taskdepfunc in deps:
+                        exec_task_func(taskdepfunc, True)
+                if report:
+                    logger.info('Executing %s...' % func)
+                fn = d.getVar('FILE', True)
+                localdata = bb.build._task_data(fn, func, crd)
+                bb.build.exec_func(func, localdata)
+                executed.append(func)
+
+        logger.info('Fetching %s...' % pn)
+        exec_task_func('do_fetch', False)
+        logger.info('Unpacking...')
+        exec_task_func('do_unpack', False)
+        srcsubdir = crd.getVar('S', True)
+        if srcsubdir != workdir and os.path.dirname(srcsubdir) != workdir:
+            # Handle if S is set to a subdirectory of the source
+            srcsubdir = os.path.join(workdir, os.path.relpath(srcsubdir, workdir).split(os.sep)[0])
+
+        patchdir = os.path.join(srcsubdir, 'patches')
+        haspatches = False
+        if os.path.exists(patchdir):
+            if os.listdir(patchdir):
+                haspatches = True
+            else:
+                os.rmdir(patchdir)
+
+        if not bb.data.inherits_class('kernel-yocto', d):
+            if not os.listdir(srcsubdir):
+                logger.error("no source unpacked to S, perhaps the %s recipe doesn't use any source?" % pn)
+                return None
+
+            if not os.path.exists(os.path.join(srcsubdir, '.git')):
+                bb.process.run('git init', cwd=srcsubdir)
+                bb.process.run('git add .', cwd=srcsubdir)
+                bb.process.run('git commit -q -m "Initial commit from upstream at version %s"' % crd.getVar('PV', True), cwd=srcsubdir)
+
+            (stdout, _) = bb.process.run('git rev-parse HEAD', cwd=srcsubdir)
+            initial_rev = stdout.rstrip()
+
+            bb.process.run('git checkout -b %s' % devbranch, cwd=srcsubdir)
+            bb.process.run('git tag -f devtool-base', cwd=srcsubdir)
+
+            crd.setVar('PATCHTOOL', 'git')
+
+        logger.info('Patching...')
+        exec_task_func('do_patch', False)
+
+        bb.process.run('git tag -f devtool-patched', cwd=srcsubdir)
+
+        if os.path.exists(patchdir):
+            shutil.rmtree(patchdir)
+            if haspatches:
+                bb.process.run('git checkout patches', cwd=srcsubdir)
+
+        shutil.move(srcsubdir, srctree)
+        logger.info('Source tree extracted to %s' % srctree)
+    finally:
+        if keep_temp:
+            logger.info('Preserving temporary directory %s' % tempdir)
+        else:
+            shutil.rmtree(tempdir)
+    return initial_rev
+
+def _add_md5(config, recipename, filename):
+    import bb.utils
+    md5 = bb.utils.md5_file(filename)
+    with open(os.path.join(config.workspace_path, '.devtool_md5'), 'a') as f:
+        f.write('%s|%s|%s\n' % (recipename, os.path.relpath(filename, config.workspace_path), md5))
+
+def _check_preserve(config, recipename):
+    import bb.utils
+    origfile = os.path.join(config.workspace_path, '.devtool_md5')
+    newfile = os.path.join(config.workspace_path, '.devtool_md5_new')
+    preservepath = os.path.join(config.workspace_path, 'attic')
+    with open(origfile, 'r') as f:
+        with open(newfile, 'w') as tf:
+            for line in f.readlines():
+                splitline = line.rstrip().split('|')
+                if splitline[0] == recipename:
+                    removefile = os.path.join(config.workspace_path, splitline[1])
+                    md5 = bb.utils.md5_file(removefile)
+                    if splitline[2] != md5:
+                        bb.utils.mkdirhier(preservepath)
+                        preservefile = os.path.basename(removefile)
+                        logger.warn('File %s modified since it was written, preserving in %s' % (preservefile, preservepath))
+                        shutil.move(removefile, os.path.join(preservepath, preservefile))
+                    else:
+                        os.remove(removefile)
+                else:
+                    tf.write(line)
+    os.rename(newfile, origfile)
+
+    return False
+
+
+def modify(args, config, basepath, workspace):
+    import bb
+    import oe.recipeutils
+
+    if args.recipename in workspace:
+        logger.error("recipe %s is already in your workspace" % args.recipename)
+        return -1
+
+    if not args.extract:
+        if not os.path.isdir(args.srctree):
+            logger.error("directory %s does not exist or is not a directory (specify -x to extract source from recipe)" % args.srctree)
+            return -1
+
+    tinfoil = setup_tinfoil()
+
+    recipefile = _get_recipe_file(tinfoil.cooker, args.recipename)
+    if not recipefile:
+        # Error already logged
+        return -1
+    rd = oe.recipeutils.parse_recipe(recipefile, tinfoil.config_data)
+
+    initial_rev = None
+    commits = []
+    srctree = os.path.abspath(args.srctree)
+    if args.extract:
+        initial_rev = _extract_source(args.srctree, False, args.branch, rd)
+        if not initial_rev:
+            return -1
+        # Get list of commits since this revision
+        (stdout, _) = bb.process.run('git rev-list --reverse %s..HEAD' % initial_rev, cwd=args.srctree)
+        commits = stdout.split()
+    else:
+        if os.path.exists(os.path.join(args.srctree, '.git')):
+            (stdout, _) = bb.process.run('git rev-parse HEAD', cwd=args.srctree)
+            initial_rev = stdout.rstrip()
+
+    # Handle if S is set to a subdirectory of the source
+    s = rd.getVar('S', True)
+    workdir = rd.getVar('WORKDIR', True)
+    if s != workdir and os.path.dirname(s) != workdir:
+        srcsubdir = os.sep.join(os.path.relpath(s, workdir).split(os.sep)[1:])
+        srctree = os.path.join(srctree, srcsubdir)
+
+    appendpath = os.path.join(config.workspace_path, 'appends')
+    if not os.path.exists(appendpath):
+        os.makedirs(appendpath)
+
+    appendname = os.path.splitext(os.path.basename(recipefile))[0]
+    if args.wildcard:
+        appendname = re.sub(r'_.*', '_%', appendname)
+    appendfile = os.path.join(appendpath, appendname + '.bbappend')
+    with open(appendfile, 'w') as f:
+        f.write('FILESEXTRAPATHS_prepend := "${THISDIR}/${PN}:"\n\n')
+        f.write('inherit externalsrc\n')
+        f.write('# NOTE: We use pn- overrides here to avoid affecting multiple variants in the case where the recipe uses BBCLASSEXTEND\n')
+        f.write('EXTERNALSRC_pn-%s = "%s"\n' % (args.recipename, srctree))
+        if bb.data.inherits_class('autotools-brokensep', rd):
+            logger.info('using source tree as build directory since original recipe inherits autotools-brokensep')
+            f.write('EXTERNALSRC_BUILD_pn-%s = "%s"\n' % (args.recipename, srctree))
+        if initial_rev:
+            f.write('\n# initial_rev: %s\n' % initial_rev)
+            for commit in commits:
+                f.write('# commit: %s\n' % commit)
+
+    _add_md5(config, args.recipename, appendfile)
+
+    logger.info('Recipe %s now set up to build from %s' % (args.recipename, srctree))
+
+    return 0
+
+
+def update_recipe(args, config, basepath, workspace):
+    if args.recipename not in workspace:
+        logger.error("no recipe named %s in your workspace" % args.recipename)
+        return -1
+
+    # Get initial revision from bbappend
+    appends = glob.glob(os.path.join(config.workspace_path, 'appends', '%s_*.bbappend' % args.recipename))
+    if not appends:
+        logger.error('unable to find workspace bbappend for recipe %s' % args.recipename)
+        return -1
+
+    tinfoil = setup_tinfoil()
+    import bb
+    from oe.patch import GitApplyTree
+    import oe.recipeutils
+
+    srctree = workspace[args.recipename]
+    commits = []
+    update_rev = None
+    if args.initial_rev:
+        initial_rev = args.initial_rev
+    else:
+        initial_rev = None
+        with open(appends[0], 'r') as f:
+            for line in f:
+                if line.startswith('# initial_rev:'):
+                    initial_rev = line.split(':')[-1].strip()
+                elif line.startswith('# commit:'):
+                    commits.append(line.split(':')[-1].strip())
+
+        if initial_rev:
+            # Find first actually changed revision
+            (stdout, _) = bb.process.run('git rev-list --reverse %s..HEAD' % initial_rev, cwd=srctree)
+            newcommits = stdout.split()
+            for i in xrange(min(len(commits), len(newcommits))):
+                if newcommits[i] == commits[i]:
+                    update_rev = commits[i]
+
+    if not initial_rev:
+        logger.error('Unable to find initial revision - please specify it with --initial-rev')
+        return -1
+
+    if not update_rev:
+        update_rev = initial_rev
+
+    # Find list of existing patches in recipe file
+    recipefile = _get_recipe_file(tinfoil.cooker, args.recipename)
+    if not recipefile:
+        # Error already logged
+        return -1
+    rd = oe.recipeutils.parse_recipe(recipefile, tinfoil.config_data)
+    existing_patches = oe.recipeutils.get_recipe_patches(rd)
+
+    removepatches = []
+    if not args.no_remove:
+        # Get all patches from source tree and check if any should be removed
+        tempdir = tempfile.mkdtemp(prefix='devtool')
+        try:
+            GitApplyTree.extractPatches(srctree, initial_rev, tempdir)
+            newpatches = os.listdir(tempdir)
+            for patch in existing_patches:
+                patchfile = os.path.basename(patch)
+                if patchfile not in newpatches:
+                    removepatches.append(patch)
+        finally:
+            shutil.rmtree(tempdir)
+
+    # Get updated patches from source tree
+    tempdir = tempfile.mkdtemp(prefix='devtool')
+    try:
+        GitApplyTree.extractPatches(srctree, update_rev, tempdir)
+
+        # Match up and replace existing patches with corresponding new patches
+        updatepatches = False
+        updaterecipe = False
+        newpatches = os.listdir(tempdir)
+        for patch in existing_patches:
+            patchfile = os.path.basename(patch)
+            if patchfile in newpatches:
+                logger.info('Updating patch %s' % patchfile)
+                shutil.move(os.path.join(tempdir, patchfile), patch)
+                newpatches.remove(patchfile)
+                updatepatches = True
+        srcuri = (rd.getVar('SRC_URI', False) or '').split()
+        if newpatches:
+            # Add any patches left over
+            patchdir = os.path.join(os.path.dirname(recipefile), rd.getVar('BPN', True))
+            bb.utils.mkdirhier(patchdir)
+            for patchfile in newpatches:
+                logger.info('Adding new patch %s' % patchfile)
+                shutil.move(os.path.join(tempdir, patchfile), os.path.join(patchdir, patchfile))
+                srcuri.append('file://%s' % patchfile)
+                updaterecipe = True
+        if removepatches:
+            # Remove any patches that we don't need
+            for patch in removepatches:
+                patchfile = os.path.basename(patch)
+                for i in xrange(len(srcuri)):
+                    if srcuri[i].startswith('file://') and os.path.basename(srcuri[i]).split(';')[0] == patchfile:
+                        logger.info('Removing patch %s' % patchfile)
+                        srcuri.pop(i)
+                        # FIXME "git rm" here would be nice if the file in question is tracked
+                        # FIXME there's a chance that this file is referred to by another recipe, in which case deleting wouldn't be the right thing to do
+                        os.remove(patch)
+                        updaterecipe = True
+                        break
+        if updaterecipe:
+            logger.info('Updating recipe %s' % os.path.basename(recipefile))
+            oe.recipeutils.patch_recipe(rd, recipefile, {'SRC_URI': ' '.join(srcuri)})
+        elif not updatepatches:
+            # Neither patches nor recipe were updated
+            logger.info('No patches need updating')
+    finally:
+        shutil.rmtree(tempdir)
+
+    return 0
+
+
+def status(args, config, basepath, workspace):
+    if workspace:
+        for recipe, value in workspace.iteritems():
+            print("%s: %s" % (recipe, value))
+    else:
+        logger.info('No recipes currently in your workspace - you can use "devtool modify" to work on an existing recipe or "devtool add" to add a new one')
+    return 0
+
+
+def reset(args, config, basepath, workspace):
+    import bb.utils
+    if args.recipename not in workspace:
+        logger.error("no recipe named %s in your workspace" % args.recipename)
+        return -1
+    _check_preserve(config, args.recipename)
+
+    preservepath = os.path.join(config.workspace_path, 'attic', args.recipename)
+    def preservedir(origdir):
+        if os.path.exists(origdir):
+            for fn in os.listdir(origdir):
+                logger.warn('Preserving %s in %s' % (fn, preservepath))
+                bb.utils.mkdirhier(preservepath)
+                shutil.move(os.path.join(origdir, fn), os.path.join(preservepath, fn))
+            os.rmdir(origdir)
+
+    preservedir(os.path.join(config.workspace_path, 'recipes', args.recipename))
+    # We don't automatically create this dir next to appends, but the user can
+    preservedir(os.path.join(config.workspace_path, 'appends', args.recipename))
+    return 0
+
+
+def build(args, config, basepath, workspace):
+    import bb
+    if args.recipename not in workspace:
+        logger.error("no recipe named %s in your workspace" % args.recipename)
+        return -1
+    exec_build_env_command(config.init_path, basepath, 'bitbake -c install %s' % args.recipename, watch=True)
+
+    return 0
+
+
+def register_commands(subparsers, context):
+    parser_add = subparsers.add_parser('add', help='Add a new recipe',
+                                       formatter_class=argparse.ArgumentDefaultsHelpFormatter)
+    parser_add.add_argument('recipename', help='Name for new recipe to add')
+    parser_add.add_argument('srctree', help='Path to external source tree')
+    parser_add.add_argument('--version', '-V', help='Version to use within recipe (PV)')
+    parser_add.set_defaults(func=add)
+
+    parser_add = subparsers.add_parser('modify', help='Modify the source for an existing recipe',
+                                       formatter_class=argparse.ArgumentDefaultsHelpFormatter)
+    parser_add.add_argument('recipename', help='Name for recipe to edit')
+    parser_add.add_argument('srctree', help='Path to external source tree')
+    parser_add.add_argument('--wildcard', '-w', action="store_true", help='Use wildcard for unversioned bbappend')
+    parser_add.add_argument('--extract', '-x', action="store_true", help='Extract source as well')
+    parser_add.add_argument('--branch', '-b', default="devtool", help='Name for development branch to checkout')
+    parser_add.set_defaults(func=modify)
+
+    parser_add = subparsers.add_parser('extract', help='Extract the source for an existing recipe',
+                                       formatter_class=argparse.ArgumentDefaultsHelpFormatter)
+    parser_add.add_argument('recipename', help='Name for recipe to extract the source for')
+    parser_add.add_argument('srctree', help='Path to where to extract the source tree')
+    parser_add.add_argument('--branch', '-b', default="devtool", help='Name for development branch to checkout')
+    parser_add.add_argument('--keep-temp', action="store_true", help='Keep temporary directory (for debugging)')
+    parser_add.set_defaults(func=extract)
+
+    parser_add = subparsers.add_parser('update-recipe', help='Apply changes from external source tree to recipe',
+                                       formatter_class=argparse.ArgumentDefaultsHelpFormatter)
+    parser_add.add_argument('recipename', help='Name of recipe to update')
+    parser_add.add_argument('--initial-rev', help='Starting revision for patches')
+    parser_add.add_argument('--no-remove', '-n', action="store_true", help='Don\'t remove patches, only add or update')
+    parser_add.set_defaults(func=update_recipe)
+
+    parser_status = subparsers.add_parser('status', help='Show status',
+                                          formatter_class=argparse.ArgumentDefaultsHelpFormatter)
+    parser_status.set_defaults(func=status)
+
+    parser_build = subparsers.add_parser('build', help='Build recipe',
+                                         formatter_class=argparse.ArgumentDefaultsHelpFormatter)
+    parser_build.add_argument('recipename', help='Recipe to build')
+    parser_build.set_defaults(func=build)
+
+    parser_reset = subparsers.add_parser('reset', help='Remove a recipe from your workspace',
+                                         formatter_class=argparse.ArgumentDefaultsHelpFormatter)
+    parser_reset.add_argument('recipename', help='Recipe to reset')
+    parser_reset.set_defaults(func=reset)
+
-- 
1.9.3
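The exec_watch() helper added in scripts/lib/devtool/__init__.py above streams bitbake's output to the console a character at a time while also capturing it for the caller. A minimal, self-contained sketch of that pattern (Python 3 syntax, with a generic shell command standing in for bitbake) is:

```python
import subprocess
import sys

def exec_watch(cmd):
    # Run a command, echoing its combined stdout/stderr as it arrives
    # and returning the full captured text once the process exits
    process = subprocess.Popen(cmd, shell=isinstance(cmd, str),
                               stdout=subprocess.PIPE,
                               stderr=subprocess.STDOUT)
    buf = ''
    while True:
        out = process.stdout.read(1).decode('utf-8', 'replace')
        if out:
            sys.stdout.write(out)
            sys.stdout.flush()
            buf += out
        elif process.poll() is not None:
            # EOF on the pipe and the child has exited
            break
    return buf

output = exec_watch('echo hello; echo world')
```

Reading one byte at a time keeps the echo responsive at the cost of throughput; for long builds a line-buffered readline() loop would be a common alternative.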



^ permalink raw reply related	[flat|nested] 17+ messages in thread
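The .devtool_md5 bookkeeping in standard.py above works as follows: _add_md5() records one "recipe|relative path|checksum" line per file devtool writes, and _check_preserve() later deletes files that are still pristine while moving user-modified ones into attic/. A standalone sketch of that scheme, where the helper names and workspace layout are illustrative stand-ins for the bb.utils-based originals, not the patch's API:

```python
import hashlib
import os
import shutil
import tempfile

def md5_file(path):
    # Stand-in for bb.utils.md5_file(): checksum of the file contents
    with open(path, 'rb') as f:
        return hashlib.md5(f.read()).hexdigest()

def add_md5(workspace, recipename, filename):
    # One "recipe|relative path|md5" record per file devtool writes
    with open(os.path.join(workspace, '.devtool_md5'), 'a') as f:
        f.write('%s|%s|%s\n' % (recipename,
                                os.path.relpath(filename, workspace),
                                md5_file(filename)))

def check_preserve(workspace, recipename):
    # Drop the recipe's recorded files; anything the user has modified
    # since it was written is moved to attic/ instead of being deleted
    recordfile = os.path.join(workspace, '.devtool_md5')
    preserved = []
    kept = []
    with open(recordfile) as f:
        for line in f:
            recipe, relpath, md5 = line.rstrip().split('|')
            if recipe != recipename:
                kept.append(line)
                continue
            target = os.path.join(workspace, relpath)
            if md5_file(target) != md5:
                atticdir = os.path.join(workspace, 'attic')
                os.makedirs(atticdir, exist_ok=True)
                shutil.move(target, os.path.join(atticdir, os.path.basename(target)))
                preserved.append(relpath)
            else:
                os.remove(target)
    with open(recordfile, 'w') as f:
        f.writelines(kept)
    return preserved

# Demonstrate in a throwaway workspace
ws = tempfile.mkdtemp()
recipe = os.path.join(ws, 'demo.bb')
with open(recipe, 'w') as f:
    f.write('SUMMARY = "demo"\n')
add_md5(ws, 'demo', recipe)
with open(recipe, 'a') as f:
    f.write('# local edit\n')   # simulate a user modification
preserved = check_preserve(ws, 'demo')
print(preserved)  # → ['demo.bb']: the edited file is preserved, not deleted
```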

* [PATCH 14/15] scripts/devtool: Support deploy/undeploy function
  2014-12-19 11:41 [PATCH 00/15] Developer workflow tools Paul Eggleton
                   ` (12 preceding siblings ...)
  2014-12-19 11:41 ` [PATCH 13/15] scripts/devtool: add development helper tool Paul Eggleton
@ 2014-12-19 11:41 ` Paul Eggleton
  2014-12-19 11:41 ` [PATCH 15/15] devtool: add QA tests Paul Eggleton
  2014-12-23 12:30 ` [PATCH 00/15] Developer workflow tools Trevor Woerner
  15 siblings, 0 replies; 17+ messages in thread
From: Paul Eggleton @ 2014-12-19 11:41 UTC (permalink / raw)
  To: openembedded-core

From: Junchun Guan <junchunx.guan@intel.com>

Deploy a recipe's output files to a live target machine using scp.
Store the list of deployed files and the target machine info on the
local host once deployment is done.
Undeploy previously deployed recipe output files on the target machine
using the stored deployment info.

[YOCTO #6654]

Signed-off-by: Junchun Guan <junchunx.guan@intel.com>
Signed-off-by: Paul Eggleton <paul.eggleton@linux.intel.com>
---
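The deploy command in this patch discovers the recipe's image directory (${D}) by running `bitbake -e` and grepping the variable dump. Applied to a fabricated fragment of such a dump (the path below is made up for illustration), the parsing step looks like this:

```python
import re

# Fabricated fragment of `bitbake -e <recipe>` output; the real dump
# is thousands of lines of variable assignments.
sample_env = '''\
# $D [2 operations]
D="/home/user/build/tmp/work/core2-64-poky-linux/foo/1.0-r0/image"
DEPLOY_DIR="/home/user/build/tmp/deploy"
'''

def get_recipe_outdir(env_output):
    # Same regex the deploy plugin uses: a line-anchored D="..." match,
    # so variables like DEPLOY_DIR do not match
    match = re.search(r'^D="(.*)"', env_output, re.MULTILINE)
    return match.group(1) if match else None

print(get_recipe_outdir(sample_env))
# → /home/user/build/tmp/work/core2-64-poky-linux/foo/1.0-r0/image
```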
 scripts/lib/devtool/deploy.py | 100 ++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 100 insertions(+)
 create mode 100644 scripts/lib/devtool/deploy.py
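After copying the files, deploy records where each one landed on the target so that undeploy can remove them later. The walk that builds that list can be sketched on its own; the ${D} layout below is fabricated for illustration:

```python
import os
import tempfile

def list_deployed_files(recipe_outdir, destdir):
    # Mirror of the deploy plugin's walk: record each installed file's
    # eventual absolute path on the target
    files_list = []
    for root, _, files in os.walk(recipe_outdir):
        for filename in files:
            rel = os.path.relpath(os.path.join(root, filename), recipe_outdir)
            files_list.append(os.path.join(destdir, rel))
    return sorted(files_list)

# Fabricated ${D} contents: one binary and one config file
outdir = tempfile.mkdtemp()
os.makedirs(os.path.join(outdir, 'usr', 'bin'))
with open(os.path.join(outdir, 'usr', 'bin', 'foo'), 'w') as f:
    f.write('')
with open(os.path.join(outdir, 'foo.conf'), 'w') as f:
    f.write('')
print(list_deployed_files(outdir, '/'))  # → ['/foo.conf', '/usr/bin/foo']
```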

diff --git a/scripts/lib/devtool/deploy.py b/scripts/lib/devtool/deploy.py
new file mode 100644
index 0000000..bd23e95
--- /dev/null
+++ b/scripts/lib/devtool/deploy.py
@@ -0,0 +1,100 @@
+# Development tool - deploy/undeploy command plugin
+#
+# Copyright (C) 2014 Intel Corporation
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License along
+# with this program; if not, write to the Free Software Foundation, Inc.,
+# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
+
+import os
+import subprocess
+import logging
+from devtool import exec_build_env_command
+
+logger = logging.getLogger('devtool')
+
+def plugin_init(pluginlist):
+    pass
+
+
+def deploy(args, config, basepath, workspace):
+    import re
+    from devtool import exec_build_env_command
+
+    if args.recipename not in workspace:
+        logger.error("No recipe named %s in your workspace" % args.recipename)
+        return -1
+    try:
+        host, destdir = args.target.split(':')
+    except ValueError:
+        destdir = '/'
+    else:
+        args.target = host
+
+    deploy_dir = os.path.join(basepath, 'target_deploy', args.target)
+    deploy_file = os.path.join(deploy_dir, args.recipename + '.list')
+
+    if os.path.exists(deploy_file):
+        undeploy(args, config, basepath, workspace)
+
+    stdout, stderr = exec_build_env_command(config.init_path, basepath, 'bitbake -e %s' % args.recipename, shell=True)
+    recipe_outdir = re.search(r'^D="(.*)"', stdout, re.MULTILINE).group(1)
+    ret = subprocess.call('scp -qr %s/* %s:%s' % (recipe_outdir, args.target, destdir), shell=True)
+    if ret != 0:
+        return ret
+
+    logger.info('Successfully deployed %s' % recipe_outdir)
+
+    if not os.path.exists(deploy_dir):
+        os.makedirs(deploy_dir)
+
+    files_list = []
+    for root, _, files in os.walk(recipe_outdir):
+        for filename in files:
+            filename = os.path.relpath(os.path.join(root, filename), recipe_outdir)
+            files_list.append(os.path.join(destdir, filename))
+
+    with open(deploy_file, 'w') as fobj:
+        fobj.write('\n'.join(files_list))
+
+    return 0
+
+def undeploy(args, config, basepath, workspace):
+
+    deploy_file = os.path.join(basepath, 'target_deploy', args.target, args.recipename + '.list')
+    if not os.path.exists(deploy_file):
+        logger.error('%s has not been deployed' % args.recipename)
+        return -1
+
+    ret = subprocess.call("scp -q %s %s:/tmp" % (deploy_file, args.target), shell=True)
+    if ret != 0:
+        logger.error('Failed to copy %s to %s' % (deploy_file, args.target))
+        return -1
+
+    ret = subprocess.call("ssh %s 'xargs -n1 rm -f </tmp/%s'" % (args.target, os.path.basename(deploy_file)), shell=True)
+    if ret == 0:
+        logger.info('Successfully undeployed %s' % args.recipename)
+        os.remove(deploy_file)
+
+    return ret
+
+
+def register_commands(subparsers, context):
+    parser_deploy = subparsers.add_parser('deploy-target', help='Deploy recipe output files to live target machine')
+    parser_deploy.add_argument('recipename', help='Recipe to deploy')
+    parser_deploy.add_argument('target', help='Live target machine running an ssh server: user@hostname[:destdir]')
+    parser_deploy.set_defaults(func=deploy)
+
+    parser_undeploy = subparsers.add_parser('undeploy-target', help='Undeploy recipe output files in live target machine')
+    parser_undeploy.add_argument('recipename', help='Recipe to undeploy')
+    parser_undeploy.add_argument('target', help='Live target machine running an ssh server: user@hostname')
+    parser_undeploy.set_defaults(func=undeploy)
-- 
1.9.3



^ permalink raw reply related	[flat|nested] 17+ messages in thread

* [PATCH 15/15] devtool: add QA tests
  2014-12-19 11:41 [PATCH 00/15] Developer workflow tools Paul Eggleton
                   ` (13 preceding siblings ...)
  2014-12-19 11:41 ` [PATCH 14/15] scripts/devtool: Support deploy/undeploy function Paul Eggleton
@ 2014-12-19 11:41 ` Paul Eggleton
  2014-12-23 12:30 ` [PATCH 00/15] Developer workflow tools Trevor Woerner
  15 siblings, 0 replies; 17+ messages in thread
From: Paul Eggleton @ 2014-12-19 11:41 UTC (permalink / raw)
  To: openembedded-core

Add some QA tests for devtool (and recipetool). These aren't
comprehensive, but they are a start and have already helped me catch
and fix a number of regressions.
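
Most of the recipe checks in these tests boil down to parsing simple
VAR = "value" assignments and "inherit" lines out of the generated
recipe and comparing them against expected values. A standalone sketch
of that parsing logic (illustrative only; the recipe content below is
made up):

```python
def parse_recipe_vars(lines):
    """Collect simple VAR = "value" assignments and inherited classes."""
    values = {}
    inherits = []
    for line in lines:
        if '=' in line:
            var, _, value = line.partition('=')
            values[var.rstrip()] = value.strip().strip('"')
        if line.startswith('inherit '):
            inherits = line.split()[1:]
    return values, inherits

recipe = [
    'LICENSE = "GPLv2"',
    'SRC_URI = "https://example.com/logrotate-${PV}.tar.gz"',
    'inherit autotools pkgconfig',
]
values, inherits = parse_recipe_vars(recipe)
```

Note this deliberately only handles single-line assignments, which is
all the generated skeleton recipes contain.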

Signed-off-by: Paul Eggleton <paul.eggleton@linux.intel.com>
---
 meta/lib/oeqa/selftest/devtool.py | 239 ++++++++++++++++++++++++++++++++++++++
 1 file changed, 239 insertions(+)
 create mode 100644 meta/lib/oeqa/selftest/devtool.py

diff --git a/meta/lib/oeqa/selftest/devtool.py b/meta/lib/oeqa/selftest/devtool.py
new file mode 100644
index 0000000..e8ff536
--- /dev/null
+++ b/meta/lib/oeqa/selftest/devtool.py
@@ -0,0 +1,239 @@
+import unittest
+import os
+import logging
+import re
+import shutil
+import tempfile
+import glob
+
+import oeqa.utils.ftools as ftools
+from oeqa.selftest.base import oeSelfTest
+from oeqa.utils.commands import runCmd, bitbake, get_bb_var
+from oeqa.utils.decorators import testcase
+
+class DevtoolTests(oeSelfTest):
+
+    def test_create_workspace(self):
+        # Check preconditions
+        workspacedir = os.path.join(self.builddir, 'workspace')
+        self.assertTrue(not os.path.exists(workspacedir), 'This test cannot be run with a workspace directory under the build directory')
+        result = runCmd('bitbake-layers show-layers')
+        self.assertTrue('/workspace' not in result.output, 'This test cannot be run with a workspace layer in bblayers.conf')
+        # Try creating a workspace layer with a specific path
+        tempdir = tempfile.mkdtemp(prefix='devtoolqa')
+        self.track_for_cleanup(tempdir)
+        result = runCmd('devtool create-workspace %s' % tempdir)
+        self.assertTrue(os.path.isfile(os.path.join(tempdir, 'conf', 'layer.conf')))
+        result = runCmd('bitbake-layers show-layers')
+        self.assertTrue(tempdir in result.output)
+        # Try creating a workspace layer with the default path
+        self.track_for_cleanup(workspacedir)
+        self.add_command_to_tearDown('bitbake-layers remove-layer */workspace')
+        result = runCmd('devtool create-workspace')
+        self.assertTrue(os.path.isfile(os.path.join(workspacedir, 'conf', 'layer.conf')))
+        result = runCmd('bitbake-layers show-layers')
+        self.assertTrue(tempdir not in result.output)
+        self.assertTrue(workspacedir in result.output)
+
+    def test_recipetool_create(self):
+        # Try adding a recipe
+        tempdir = tempfile.mkdtemp(prefix='devtoolqa')
+        self.track_for_cleanup(tempdir)
+        tempsrc = os.path.join(tempdir, 'srctree')
+        os.makedirs(tempsrc)
+        recipefile = os.path.join(tempdir, 'logrotate_3.8.7.bb')
+        srcuri = 'https://fedorahosted.org/releases/l/o/logrotate/logrotate-3.8.7.tar.gz'
+        result = runCmd('recipetool create -o %s %s -x %s' % (recipefile, srcuri, tempsrc))
+        self.assertTrue(os.path.isfile(recipefile))
+        checkvars = {}
+        checkvars['LICENSE'] = 'GPLv2'
+        checkvars['LIC_FILES_CHKSUM'] = 'file://COPYING;md5=18810669f13b87348459e611d31ab760'
+        checkvars['SRC_URI'] = 'https://fedorahosted.org/releases/l/o/logrotate/logrotate-${PV}.tar.gz'
+        checkvars['SRC_URI[md5sum]'] = '99e08503ef24c3e2e3ff74cc5f3be213'
+        checkvars['SRC_URI[sha256sum]'] = 'f6ba691f40e30e640efa2752c1f9499a3f9738257660994de70a45fe00d12b64'
+        with open(recipefile, 'r') as f:
+            for line in f:
+                if '=' in line:
+                    splitline = line.split('=', 1)
+                    var = splitline[0].rstrip()
+                    value = splitline[1].strip().strip('"')
+                    if var in checkvars:
+                        needvalue = checkvars.pop(var)
+                        self.assertEqual(value, needvalue)
+                if line.startswith('inherit '):
+                    inherits = line.split()[1:]
+
+        self.assertEqual(checkvars, {}, 'Some variables not found')
+
+    def test_recipetool_create_git(self):
+        # Try adding a recipe
+        tempdir = tempfile.mkdtemp(prefix='devtoolqa')
+        self.track_for_cleanup(tempdir)
+        tempsrc = os.path.join(tempdir, 'srctree')
+        os.makedirs(tempsrc)
+        recipefile = os.path.join(tempdir, 'libmatchbox.bb')
+        srcuri = 'git://git.yoctoproject.org/libmatchbox'
+        result = runCmd('recipetool create -o %s %s -x %s' % (recipefile, srcuri, tempsrc))
+        self.assertTrue(os.path.isfile(recipefile))
+        checkvars = {}
+        checkvars['LICENSE'] = 'LGPLv2.1'
+        checkvars['LIC_FILES_CHKSUM'] = 'file://COPYING;md5=7fbc338309ac38fefcd64b04bb903e34'
+        checkvars['S'] = '${WORKDIR}/git'
+        checkvars['PV'] = '1.0+git${SRCPV}'
+        checkvars['SRC_URI'] = srcuri
+        checkvars['DEPENDS'] = 'libpng pango libx11 libxext'
+        inherits = []
+        with open(recipefile, 'r') as f:
+            for line in f:
+                if '=' in line:
+                    splitline = line.split('=', 1)
+                    var = splitline[0].rstrip()
+                    value = splitline[1].strip().strip('"')
+                    if var in checkvars:
+                        needvalue = checkvars.pop(var)
+                        self.assertEqual(value, needvalue)
+                if line.startswith('inherit '):
+                    inherits = line.split()[1:]
+
+        self.assertEqual(checkvars, {}, 'Some variables not found')
+
+        self.assertIn('autotools', inherits, 'Missing inherit of autotools')
+        self.assertIn('pkgconfig', inherits, 'Missing inherit of pkgconfig')
+
+    def test_devtool_add(self):
+        # Check preconditions
+        workspacedir = os.path.join(self.builddir, 'workspace')
+        self.assertTrue(not os.path.exists(workspacedir), 'This test cannot be run with a workspace directory under the build directory')
+        # Fetch source
+        tempdir = tempfile.mkdtemp(prefix='devtoolqa')
+        self.track_for_cleanup(tempdir)
+        url = 'http://www.ivarch.com/programs/sources/pv-1.5.3.tar.bz2'
+        result = runCmd('wget %s' % url, cwd=tempdir)
+        result = runCmd('tar xfv pv-1.5.3.tar.bz2', cwd=tempdir)
+        srcdir = os.path.join(tempdir, 'pv-1.5.3')
+        self.assertTrue(os.path.isfile(os.path.join(srcdir, 'configure')))
+        # Test devtool add
+        self.track_for_cleanup(workspacedir)
+        self.add_command_to_tearDown('bitbake -c cleansstate pv')
+        self.add_command_to_tearDown('bitbake-layers remove-layer */workspace')
+        result = runCmd('devtool add pv %s' % srcdir)
+        self.assertTrue(os.path.exists(os.path.join(workspacedir, 'conf', 'layer.conf')), 'Workspace directory not created')
+        # Test devtool status
+        result = runCmd('devtool status')
+        self.assertIn('pv', result.output)
+        self.assertIn(srcdir, result.output)
+        # Clean up anything in the workdir/sysroot/sstate cache (have to do this *after* devtool add since the recipe only exists then)
+        bitbake('pv -c cleansstate')
+        # Test devtool build
+        result = runCmd('devtool build pv')
+        installdir = get_bb_var('D', 'pv')
+        self.assertTrue(installdir)
+        bindir = get_bb_var('bindir', 'pv')
+        self.assertTrue(bindir)
+        if bindir[0] == '/':
+            bindir = bindir[1:]
+        self.assertTrue(os.path.isfile(os.path.join(installdir, bindir, 'pv')), 'pv binary not found in D')
+
+    def test_devtool_modify(self):
+        # Check preconditions
+        workspacedir = os.path.join(self.builddir, 'workspace')
+        self.assertTrue(not os.path.exists(workspacedir), 'This test cannot be run with a workspace directory under the build directory')
+        # Clean up anything in the workdir/sysroot/sstate cache
+        bitbake('mdadm -c cleansstate')
+        # Try modifying a recipe
+        tempdir = tempfile.mkdtemp(prefix='devtoolqa')
+        self.track_for_cleanup(tempdir)
+        self.track_for_cleanup(workspacedir)
+        self.add_command_to_tearDown('bitbake-layers remove-layer */workspace')
+        self.add_command_to_tearDown('bitbake -c clean mdadm')
+        result = runCmd('devtool modify mdadm -x %s' % tempdir)
+        self.assertTrue(os.path.exists(os.path.join(tempdir, 'Makefile')), 'Extracted source could not be found')
+        self.assertTrue(os.path.isdir(os.path.join(tempdir, '.git')), 'git repository for external source tree not found')
+        self.assertTrue(os.path.exists(os.path.join(workspacedir, 'conf', 'layer.conf')), 'Workspace directory not created')
+        matches = glob.glob(os.path.join(workspacedir, 'appends', 'mdadm_*.bbappend'))
+        self.assertTrue(matches, 'bbappend not created')
+        # Test devtool status
+        result = runCmd('devtool status')
+        self.assertIn('mdadm', result.output)
+        self.assertIn(tempdir, result.output)
+        # Check git repo
+        result = runCmd('git status --porcelain', cwd=tempdir)
+        self.assertEqual(result.output.strip(), "", 'Created git repo is not clean')
+        result = runCmd('git symbolic-ref HEAD', cwd=tempdir)
+        self.assertEqual(result.output.strip(), "refs/heads/devtool", 'Wrong branch in git repo')
+        # Try building
+        bitbake('mdadm')
+        # Try making (minor) modifications to the source
+        result = runCmd("sed -i 's!^\.TH.*!.TH MDADM 8 \"\" v9.999-custom!' %s" % os.path.join(tempdir, 'mdadm.8.in'))
+        bitbake('mdadm -c package')
+        pkgd = get_bb_var('PKGD', 'mdadm')
+        self.assertTrue(pkgd)
+        mandir = get_bb_var('mandir', 'mdadm')
+        self.assertTrue(mandir)
+        if mandir[0] == '/':
+            mandir = mandir[1:]
+        with open(os.path.join(pkgd, mandir, 'man8', 'mdadm.8'), 'r') as f:
+            for line in f:
+                if line.startswith('.TH'):
+                    self.assertEqual(line.rstrip(), '.TH MDADM 8 "" v9.999-custom', 'man file not modified')
+        # Test devtool reset
+        result = runCmd('devtool reset mdadm')
+        result = runCmd('devtool status')
+        self.assertNotIn('mdadm', result.output)
+
+    def test_devtool_update_recipe(self):
+        # Check preconditions
+        workspacedir = os.path.join(self.builddir, 'workspace')
+        self.assertTrue(not os.path.exists(workspacedir), 'This test cannot be run with a workspace directory under the build directory')
+        testrecipe = 'minicom'
+        recipefile = get_bb_var('FILE', testrecipe)
+        result = runCmd('git status . --porcelain', cwd=os.path.dirname(recipefile))
+        self.assertEqual(result.output.strip(), "", '%s recipe is not clean' % testrecipe)
+        # First, modify a recipe
+        tempdir = tempfile.mkdtemp(prefix='devtoolqa')
+        self.track_for_cleanup(tempdir)
+        self.track_for_cleanup(workspacedir)
+        self.add_command_to_tearDown('bitbake-layers remove-layer */workspace')
+        # (don't bother with cleaning the recipe on teardown, we won't be building it)
+        result = runCmd('devtool modify %s -x %s' % (testrecipe, tempdir))
+        # Check git repo
+        self.assertTrue(os.path.isdir(os.path.join(tempdir, '.git')), 'git repository for external source tree not found')
+        result = runCmd('git status --porcelain', cwd=tempdir)
+        self.assertEqual(result.output.strip(), "", 'Created git repo is not clean')
+        result = runCmd('git symbolic-ref HEAD', cwd=tempdir)
+        self.assertEqual(result.output.strip(), "refs/heads/devtool", 'Wrong branch in git repo')
+        # Add a couple of commits
+        # FIXME: this only tests adding, need to also test update and remove
+        result = runCmd('echo "Additional line" >> README', cwd=tempdir)
+        result = runCmd('git commit -a -m "Change the README"', cwd=tempdir)
+        result = runCmd('echo "A new file" > devtool-new-file', cwd=tempdir)
+        result = runCmd('git add devtool-new-file', cwd=tempdir)
+        result = runCmd('git commit -m "Add a new file"', cwd=tempdir)
+        self.add_command_to_tearDown('cd %s; rm %s/*.patch; git checkout %s %s' % (os.path.dirname(recipefile), testrecipe, testrecipe, os.path.basename(recipefile)))
+        result = runCmd('devtool update-recipe %s' % testrecipe)
+        result = runCmd('git status . --porcelain', cwd=os.path.dirname(recipefile))
+        self.assertNotEqual(result.output.strip(), "", '%s recipe should be modified' % testrecipe)
+        status = result.output.splitlines()
+        self.assertEqual(len(status), 3, 'Unexpected number of modified files. Entire status:\n%s' % result.output)
+        for line in status:
+            if line.endswith('0001-Change-the-README.patch'):
+                self.assertEqual(line[:3], '?? ', 'Unexpected status in line: %s' % line)
+            elif line.endswith('0002-Add-a-new-file.patch'):
+                self.assertEqual(line[:3], '?? ', 'Unexpected status in line: %s' % line)
+            elif re.search('minicom_[^_]*.bb$', line):
+                self.assertEqual(line[:3], ' M ', 'Unexpected status in line: %s' % line)
+            else:
+                self.fail('Unexpected modified file in status: %s' % line)
+
+    def test_devtool_extract(self):
+        # Check preconditions
+        workspacedir = os.path.join(self.builddir, 'workspace')
+        self.assertTrue(not os.path.exists(workspacedir), 'This test cannot be run with a workspace directory under the build directory')
+        tempdir = tempfile.mkdtemp(prefix='devtoolqa')
+        # Try devtool extract
+        self.track_for_cleanup(tempdir)
+        self.track_for_cleanup(workspacedir)
+        self.add_command_to_tearDown('bitbake-layers remove-layer */workspace')
+        result = runCmd('devtool extract remake %s' % tempdir)
+        self.assertTrue(os.path.exists(os.path.join(tempdir, 'Makefile.am')), 'Extracted source could not be found')
+        self.assertTrue(os.path.isdir(os.path.join(tempdir, '.git')), 'git repository for external source tree not found')
-- 
1.9.3



^ permalink raw reply related	[flat|nested] 17+ messages in thread

* Re: [PATCH 00/15] Developer workflow tools
  2014-12-19 11:41 [PATCH 00/15] Developer workflow tools Paul Eggleton
                   ` (14 preceding siblings ...)
  2014-12-19 11:41 ` [PATCH 15/15] devtool: add QA tests Paul Eggleton
@ 2014-12-23 12:30 ` Trevor Woerner
  15 siblings, 0 replies; 17+ messages in thread
From: Trevor Woerner @ 2014-12-23 12:30 UTC (permalink / raw)
  To: openembedded-core

I've written a quick tutorial about the work Paul and his co-workers
have been doing:
https://drive.google.com/file/d/0B3KGzY5fW7laQmgxVXVTSDJHeFU/view?usp=sharing

Tested-by: Trevor Woerner <trevor.woerner@linaro.org>


^ permalink raw reply	[flat|nested] 17+ messages in thread

end of thread, other threads:[~2014-12-23 12:30 UTC | newest]

Thread overview: 17+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
2014-12-19 11:41 [PATCH 00/15] Developer workflow tools Paul Eggleton
2014-12-19 11:41 ` [PATCH 01/15] meta-environment: don't mark tasks as nostamp Paul Eggleton
2014-12-19 11:41 ` [PATCH 02/15] classes/package: move read_shlib_providers() to a common unit Paul Eggleton
2014-12-19 11:41 ` [PATCH 03/15] lib/oe/patch: fall back to patch if git apply fails Paul Eggleton
2014-12-19 11:41 ` [PATCH 04/15] lib/oe/patch: auto-commit when falling back from git am Paul Eggleton
2014-12-19 11:41 ` [PATCH 05/15] lib/oe/patch: use --keep-cr with " Paul Eggleton
2014-12-19 11:41 ` [PATCH 06/15] lib/oe/patch.py: abort "git am" if it fails Paul Eggleton
2014-12-19 11:41 ` [PATCH 07/15] lib/oe/patch: add support for extracting patches from git tree Paul Eggleton
2014-12-19 11:41 ` [PATCH 08/15] lib/oe: add recipeutils module Paul Eggleton
2014-12-19 11:41 ` [PATCH 09/15] oeqa/utils: make get_bb_var() more reliable Paul Eggleton
2014-12-19 11:41 ` [PATCH 10/15] classes/externalsrc: set do_compile as nostamp Paul Eggleton
2014-12-19 11:41 ` [PATCH 11/15] scripts/recipetool: Add a recipe auto-creation script Paul Eggleton
2014-12-19 11:41 ` [PATCH 12/15] scripts: add scriptutils module Paul Eggleton
2014-12-19 11:41 ` [PATCH 13/15] scripts/devtool: add development helper tool Paul Eggleton
2014-12-19 11:41 ` [PATCH 14/15] scripts/devtool: Support deploy/undeploy function Paul Eggleton
2014-12-19 11:41 ` [PATCH 15/15] devtool: add QA tests Paul Eggleton
2014-12-23 12:30 ` [PATCH 00/15] Developer workflow tools Trevor Woerner

This is a public inbox, see mirroring instructions
for how to clone and mirror all data and code used for this inbox