tools: update gyp-next to 0.18.2

PR-URL: https://github.com/nodejs/node/pull/55160
Reviewed-By: Luigi Pinca <luigipinca@gmail.com>
Reviewed-By: Chengzhong Wu <legendecas@gmail.com>
Reviewed-By: Jiawen Geng <technicalcute@gmail.com>
Author: Node.js GitHub Bot (2024-10-08 05:15:03 +02:00), committed by GitHub
parent ae1e2b53b7
commit 8dbca2d35b
18 changed files with 141 additions and 126 deletions

CHANGELOG.md

@@ -1,5 +1,13 @@
# Changelog
## [0.18.2](https://github.com/nodejs/gyp-next/compare/v0.18.1...v0.18.2) (2024-09-23)
### Bug Fixes
* do not assume that /usr/bin/env exists on macOS ([#216](https://github.com/nodejs/gyp-next/issues/216)) ([706d04a](https://github.com/nodejs/gyp-next/commit/706d04aba5bd18f311dc56f84720e99f64c73466))
* fix E721 lint errors ([#206](https://github.com/nodejs/gyp-next/issues/206)) ([d1299a4](https://github.com/nodejs/gyp-next/commit/d1299a49d313eccabecf97ccb56fc033afad39ad))
## [0.18.1](https://github.com/nodejs/gyp-next/compare/v0.18.0...v0.18.1) (2024-05-26)
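Most of the Python churn in this diff comes from the second changelog entry: fixing E721 means replacing `type(x) == T` comparisons with `isinstance()`. A minimal sketch of the rewritten pattern (illustrative only, not code from this commit):

```
value = ["-Wall"]

# Flagged by E721: compares type objects for equality and rejects subclasses.
if type(value) == list:
    print("old style")

# The replacement used throughout this diff: also accepts list subclasses.
if isinstance(value, list):
    print("new style")
```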

docs/Hacking.md

@@ -34,7 +34,7 @@ See [Testing](Testing.md) for more details on the test framework.
Note that it can be handy to look at the project files output by the tests
to diagnose problems. The easiest way to do that is by kindly asking the
test driver to leave the temporary directories it creates in-place.
This is done by setting the enviroment variable "PRESERVE", e.g.
This is done by setting the environment variable "PRESERVE", e.g.
```
set PRESERVE=all # On Windows

docs/LanguageSpecification.md

@@ -157,7 +157,7 @@ have structural meaning for target definitions:
| `all_dependent_settings` | A dictionary of settings to be applied to all dependents of the target, transitively. This includes direct dependents and the entire set of their dependents, and so on. This section may contain anything found within a `target` dictionary, except `configurations`, `target_name`, and `type` sections. Compare `direct_dependent_settings` and `link_settings`. |
| `configurations` | A list of dictionaries defining build configurations for the target. See the "Configurations" section below. |
| `copies` | A list of copy actions to perform. See the "Copies" section below. |
| `defines` | A list of preprocesor definitions to be passed on the command line to the C/C++ compiler (via `-D` or `/D` options). |
| `defines` | A list of preprocessor definitions to be passed on the command line to the C/C++ compiler (via `-D` or `/D` options). |
| `dependencies` | A list of targets on which this target depends. Targets in other `.gyp` files are specified as `../path/to/other.gyp:target_we_want`. |
| `direct_dependent_settings` | A dictionary of settings to be applied to other targets that depend on this target. These settings will only be applied to direct dependents. This section may contain anything found within a `target` dictionary, except `configurations`, `target_name`, and `type` sections. Compare with `all_dependent_settings` and `link_settings`. |
| `include_dirs` | A list of include directories to be passed on the command line to the C/C++ compiler (via `-I` or `/I` options). |
@@ -208,8 +208,8 @@ Configuration dictionaries may also contain these elements:
Conditionals may appear within any dictionary in a `.gyp` file. There
are two tpes of conditionals, which differ only in the timing of their
processing. `conditons` sections are processed shortly after loading
`.gyp` files, and `target_conditons` sections are processed after all
processing. `conditions` sections are processed shortly after loading
`.gyp` files, and `target_conditions` sections are processed after all
dependencies have been computed.
A conditional section is introduced with a `conditions` or

docs/Testing.md

@@ -392,7 +392,7 @@ fails the test if it does.
Verifies that the output string contains all of the "lines" in the specified
list of lines. In practice, the lines can be any substring and need not be
`\n`-terminaed lines per se. If any line is missing, the test fails.
`\n`-terminated lines per se. If any line is missing, the test fails.
```
test.must_not_contain_any_lines(output, lines)
@@ -400,7 +400,7 @@ list of lines. In practice, the lines can be any substring and need
Verifies that the output string does _not_ contain any of the "lines" in the
specified list of lines. In practice, the lines can be any substring and need
not be `\n`-terminaed lines per se. If any line exists in the output string,
not be `\n`-terminated lines per se. If any line exists in the output string,
the test fails.
```
@@ -409,7 +409,7 @@ the test fails.
Verifies that the output string contains at least one of the "lines" in the
specified list of lines. In practice, the lines can be any substring and need
not be `\n`-terminaed lines per se. If none of the specified lines is present,
not be `\n`-terminated lines per se. If none of the specified lines is present,
the test fails.
### Reading file contents

docs/UserDocumentation.md

@@ -104,7 +104,7 @@ describing all the information necessary to build the target.
`'conditions'`: A list of condition specifications that can modify the
contents of the items in the global dictionary defined by this `.gyp`
file based on the values of different variablwes. As implied by the
file based on the values of different variables. As implied by the
above example, the most common use of a `conditions` section in the
top-level dictionary is to add platform-specific targets to the
`targets` list.
@@ -375,7 +375,7 @@ If your platform-specific file does not contain a
already in the `conditions` for the target), and you can't change the
file name, there are two patterns that can be used.
**Prefererred**: Add the file to the `sources` list of the appropriate
**Preferred**: Add the file to the `sources` list of the appropriate
dictionary within the `targets` list. Add an appropriate `conditions`
section to exclude the specific files name:
@@ -807,7 +807,7 @@ directory:
```
Adding a library often involves updating multiple `.gyp` files, adding
the target to the approprate `.gyp` file (possibly a newly-added `.gyp`
the target to the appropriate `.gyp` file (possibly a newly-added `.gyp`
file), and updating targets in the other `.gyp` files that depend on
(link with) the new library.
@@ -858,7 +858,7 @@ because of those settings' being listed in the
`direct_dependent_settings` block.
Note that these settings will likely need to be replicated in the
settings for the library target itsef, so that the library will build
settings for the library target itself, so that the library will build
with the same options. This does not prevent the target from defining
additional options for its "internal" use when compiling its own source
files. (In the above example, these are the `LOCAL_DEFINE_FOR_LIBBAR`

pylib/gyp/generator/analyzer.py

@@ -699,7 +699,7 @@ class TargetCalculator:
) & set(self._root_targets)
if matching_test_targets_contains_all:
# Remove any of the targets for all that were not explicitly supplied,
# 'all' is subsequentely added to the matching names below.
# 'all' is subsequently added to the matching names below.
matching_test_targets = list(
set(matching_test_targets) & set(test_targets_no_all)
)

pylib/gyp/generator/android.py

@@ -769,7 +769,7 @@ class AndroidMkWriter:
Args:
cflags: A list of compiler flags, which may be mixed with "-I.."
Returns:
A tuple of lists: (clean_clfags, include_paths). "-I.." is trimmed.
A tuple of lists: (clean_cflags, include_paths). "-I.." is trimmed.
"""
clean_cflags = []
include_paths = []
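The corrected docstring above describes a simple partition of compiler flags. A self-contained sketch of what such a split does (hypothetical helper, not the generator's actual code):

```
def split_include_flags(cflags):
    """Partition cflags into (clean_cflags, include_paths); "-I.." is trimmed."""
    clean_cflags, include_paths = [], []
    for flag in cflags:
        if flag.startswith("-I"):
            include_paths.append(flag[2:])  # drop the "-I" prefix
        else:
            clean_cflags.append(flag)
    return clean_cflags, include_paths

# Example: one optimization flag, one include path.
assert split_include_flags(["-O2", "-Iinclude"]) == (["-O2"], ["include"])
```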

pylib/gyp/generator/cmake.py

@@ -251,7 +251,7 @@ def WriteActions(target_name, actions, extra_sources, extra_deps, path_to_gyp, o
target_name: the name of the CMake target being generated.
actions: the Gyp 'actions' dict for this target.
extra_sources: [(<cmake_src>, <src>)] to append with generated source files.
extra_deps: [<cmake_taget>] to append with generated targets.
extra_deps: [<cmake_target>] to append with generated targets.
path_to_gyp: relative path from CMakeLists.txt being generated to
the Gyp file in which the target being generated is defined.
"""
@@ -340,7 +340,7 @@ def WriteRules(target_name, rules, extra_sources, extra_deps, path_to_gyp, outpu
target_name: the name of the CMake target being generated.
actions: the Gyp 'actions' dict for this target.
extra_sources: [(<cmake_src>, <src>)] to append with generated source files.
extra_deps: [<cmake_taget>] to append with generated targets.
extra_deps: [<cmake_target>] to append with generated targets.
path_to_gyp: relative path from CMakeLists.txt being generated to
the Gyp file in which the target being generated is defined.
"""
@@ -457,7 +457,7 @@ def WriteCopies(target_name, copies, extra_deps, path_to_gyp, output):
Args:
target_name: the name of the CMake target being generated.
actions: the Gyp 'actions' dict for this target.
extra_deps: [<cmake_taget>] to append with generated targets.
extra_deps: [<cmake_target>] to append with generated targets.
path_to_gyp: relative path from CMakeLists.txt being generated to
the Gyp file in which the target being generated is defined.
"""
@@ -603,7 +603,7 @@ class CMakeNamer:
"""
def __init__(self, target_list):
self.cmake_target_base_names_conficting = set()
self.cmake_target_base_names_conflicting = set()
cmake_target_base_names_seen = set()
for qualified_target in target_list:
@@ -612,11 +612,11 @@
if cmake_target_base_name not in cmake_target_base_names_seen:
cmake_target_base_names_seen.add(cmake_target_base_name)
else:
self.cmake_target_base_names_conficting.add(cmake_target_base_name)
self.cmake_target_base_names_conflicting.add(cmake_target_base_name)
def CreateCMakeTargetName(self, qualified_target):
base_name = CreateCMakeTargetBaseName(qualified_target)
if base_name in self.cmake_target_base_names_conficting:
if base_name in self.cmake_target_base_names_conflicting:
return CreateCMakeTargetFullName(qualified_target)
return base_name
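The renamed `cmake_target_base_names_conflicting` set drives a common disambiguation scheme: every target gets its short base name unless two targets share one, in which case the fully qualified name is used instead. A standalone sketch of the idea (hypothetical helpers, not the generator's API):

```
def disambiguate(targets, base_name, full_name):
    """Prefer short names; fall back to full names on collisions."""
    seen, conflicting = set(), set()
    for target in targets:
        name = base_name(target)
        if name in seen:
            conflicting.add(name)  # a second sighting marks a conflict
        seen.add(name)
    return {t: full_name(t) if base_name(t) in conflicting else base_name(t)
            for t in targets}

# Two "util" targets from different .gyp files collide; "app" stays short.
names = disambiguate(
    ["a.gyp:util", "b.gyp:util", "a.gyp:app"],
    base_name=lambda q: q.split(":")[1],
    full_name=lambda q: q.replace(".gyp:", "_"),
)
assert names == {"a.gyp:util": "a_util", "b.gyp:util": "b_util",
                 "a.gyp:app": "app"}
```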

pylib/gyp/generator/make.py

@@ -208,7 +208,7 @@ cmd_solink_module = $(LINK.$(TOOLSET)) -o $@ -shared $(GYP_LDFLAGS) $(LDFLAGS.$(
LINK_COMMANDS_MAC = """\
quiet_cmd_alink = LIBTOOL-STATIC $@
cmd_alink = rm -f $@ && ./gyp-mac-tool filter-libtool libtool $(GYP_LIBTOOLFLAGS) -static -o $@ $(filter %.o,$^)
cmd_alink = rm -f $@ && %(python)s gyp-mac-tool filter-libtool libtool $(GYP_LIBTOOLFLAGS) -static -o $@ $(filter %%.o,$^)
quiet_cmd_link = LINK($(TOOLSET)) $@
cmd_link = $(LINK.$(TOOLSET)) $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o "$@" $(LD_INPUTS) $(LIBS)
@@ -218,7 +218,7 @@ cmd_solink = $(LINK.$(TOOLSET)) -shared $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o
quiet_cmd_solink_module = SOLINK_MODULE($(TOOLSET)) $@
cmd_solink_module = $(LINK.$(TOOLSET)) -bundle $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o $@ $(filter-out FORCE_DO_CMD, $^) $(LIBS)
""" # noqa: E501
""" % {'python': sys.executable} # noqa: E501
LINK_COMMANDS_ANDROID = """\
quiet_cmd_alink = AR($(TOOLSET)) $@
@@ -609,14 +609,14 @@ cmd_pch_mm = $(CC.$(TOOLSET)) $(GYP_PCH_OBJCXXFLAGS) $(DEPFLAGS) -c -o $@ $<
# Use $(4) for the command, since $(2) and $(3) are used as flag by do_cmd
# already.
quiet_cmd_mac_tool = MACTOOL $(4) $<
cmd_mac_tool = ./gyp-mac-tool $(4) $< "$@"
cmd_mac_tool = %(python)s gyp-mac-tool $(4) $< "$@"
quiet_cmd_mac_package_framework = PACKAGE FRAMEWORK $@
cmd_mac_package_framework = ./gyp-mac-tool package-framework "$@" $(4)
cmd_mac_package_framework = %(python)s gyp-mac-tool package-framework "$@" $(4)
quiet_cmd_infoplist = INFOPLIST $@
cmd_infoplist = $(CC.$(TOOLSET)) -E -P -Wno-trigraphs -x c $(INFOPLIST_DEFINES) "$<" -o "$@"
""" # noqa: E501
""" % {'python': sys.executable} # noqa: E501
def WriteRootHeaderSuffixRules(writer):
@@ -1834,7 +1834,7 @@ $(obj).$(TOOLSET)/$(TARGET)/%%.o: $(obj)/%%%s FORCE_DO_CMD
# Since this target depends on binary and resources which are in
# nested subfolders, the framework directory will be older than
# its dependencies usually. To prevent this rule from executing
# on every build (expensive, especially with postbuilds), expliclity
# on every build (expensive, especially with postbuilds), explicitly
# update the time on the framework directory.
self.WriteLn("\t@touch -c %s" % QuoteSpaces(self.output))
@@ -2498,7 +2498,7 @@ def GenerateOutput(target_list, target_dicts, data, params):
"PLI.host": PLI_host,
}
if flavor == "mac":
flock_command = "./gyp-mac-tool flock"
flock_command = "%s gyp-mac-tool flock" % sys.executable
header_params.update(
{
"flock": flock_command,
@@ -2548,7 +2548,7 @@ def GenerateOutput(target_list, target_dicts, data, params):
header_params.update(
{
"copy_archive_args": copy_archive_arguments,
"flock": "./gyp-flock-tool flock",
"flock": "%s gyp-flock-tool flock" % sys.executable,
"flock_index": 2,
}
)
@@ -2564,7 +2564,7 @@ def GenerateOutput(target_list, target_dicts, data, params):
{
"copy_archive_args": copy_archive_arguments,
"link_commands": LINK_COMMANDS_AIX,
"flock": "./gyp-flock-tool flock",
"flock": "%s gyp-flock-tool flock" % sys.executable,
"flock_index": 2,
}
)
@@ -2574,7 +2574,7 @@ def GenerateOutput(target_list, target_dicts, data, params):
{
"copy_archive_args": copy_archive_arguments,
"link_commands": LINK_COMMANDS_OS400,
"flock": "./gyp-flock-tool flock",
"flock": "%s gyp-flock-tool flock" % sys.executable,
"flock_index": 2,
}
)
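The make.py substitutions above implement the changelog's macOS fix: rather than executing `./gyp-mac-tool` through its shebang (which assumes `/usr/bin/env` exists), the generated commands run the script with the interpreter that is already executing gyp. Note the knock-on effect visible in `LINK_COMMANDS_MAC`: once a template string becomes %-formatted, its literal percent signs must be doubled. A minimal illustration:

```
import sys

# Before: relies on the script's shebang resolving through /usr/bin/env.
old_cmd = "./gyp-mac-tool filter-libtool libtool -static -o $@ $(filter %.o,$^)"

# After: invoke the tool via the current interpreter. The template is now
# %-formatted, so the literal "%.o" has to be escaped as "%%.o".
new_cmd = ("%(python)s gyp-mac-tool filter-libtool libtool -static -o $@ "
           "$(filter %%.o,$^)" % {"python": sys.executable})
print(new_cmd)  # e.g. /usr/bin/python3 gyp-mac-tool ... $(filter %.o,$^)
```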

pylib/gyp/generator/msvs.py

@@ -276,7 +276,7 @@ def _ToolAppend(tools, tool_name, setting, value, only_if_unset=False):
def _ToolSetOrAppend(tools, tool_name, setting, value, only_if_unset=False):
# TODO(bradnelson): ugly hack, fix this more generally!!!
if "Directories" in setting or "Dependencies" in setting:
if type(value) == str:
if isinstance(value, str):
value = value.replace("/", "\\")
else:
value = [i.replace("/", "\\") for i in value]
@@ -288,7 +288,7 @@ def _ToolSetOrAppend(tools, tool_name, setting, value, only_if_unset=False):
if tool.get(setting):
if only_if_unset:
return
if type(tool[setting]) == list and type(value) == list:
if isinstance(tool[setting], list) and isinstance(value, list):
tool[setting] += value
else:
raise TypeError(
@@ -1423,7 +1423,7 @@ def _ConvertToolsToExpectedForm(tools):
# Collapse settings with lists.
settings_fixed = {}
for setting, value in settings.items():
if type(value) == list:
if isinstance(value, list):
if (
tool == "VCLinkerTool" and setting == "AdditionalDependencies"
) or setting == "AdditionalOptions":
@@ -1816,7 +1816,7 @@ def _DictsToFolders(base_path, bucket, flat):
# Convert to folders recursively.
children = []
for folder, contents in bucket.items():
if type(contents) == dict:
if isinstance(contents, dict):
folder_children = _DictsToFolders(
os.path.join(base_path, folder), contents, flat
)
@@ -1838,9 +1838,10 @@ def _CollapseSingles(parent, node):
# Recursively explorer the tree of dicts looking for projects which are
# the sole item in a folder which has the same name as the project. Bring
# such projects up one level.
if type(node) == dict and len(node) == 1 and next(iter(node)) == parent + ".vcproj":
if (isinstance(node, dict) and len(node) == 1 and
next(iter(node)) == parent + ".vcproj"):
return node[next(iter(node))]
if type(node) != dict:
if not isinstance(node, dict):
return node
for child in node:
node[child] = _CollapseSingles(child, node[child])
@@ -1860,7 +1861,7 @@ def _GatherSolutionFolders(sln_projects, project_objects, flat):
# Walk down from the top until we hit a folder that has more than one entry.
# In practice, this strips the top-level "src/" dir from the hierarchy in
# the solution.
while len(root) == 1 and type(root[next(iter(root))]) == dict:
while len(root) == 1 and isinstance(root[next(iter(root))], dict):
root = root[next(iter(root))]
# Collapse singles.
root = _CollapseSingles("", root)
@@ -3274,7 +3275,7 @@ def _GetMSBuildPropertyGroup(spec, label, properties):
num_configurations = len(spec["configurations"])
def GetEdges(node):
# Use a definition of edges such that user_of_variable -> used_varible.
# Use a definition of edges such that user_of_variable -> used_variable.
# This happens to be easier in this case, since a variable's
# definition contains all variables it references in a single string.
edges = set()
@@ -3441,7 +3442,7 @@ def _FinalizeMSBuildSettings(spec, configuration):
def _GetValueFormattedForMSBuild(tool_name, name, value):
if type(value) == list:
if isinstance(value, list):
# For some settings, VS2010 does not automatically extends the settings
# TODO(jeanluc) Is this what we want?
if name in [

pylib/gyp/generator/ninja.py

@@ -2595,9 +2595,9 @@ def GenerateOutputForConfig(target_list, target_dicts, data, params, config_name
"alink",
description="LIBTOOL-STATIC $out, POSTBUILDS",
command="rm -f $out && "
"./gyp-mac-tool filter-libtool libtool $libtool_flags "
"%s gyp-mac-tool filter-libtool libtool $libtool_flags "
"-static -o $out $in"
"$postbuilds",
"$postbuilds" % sys.executable,
)
master_ninja.rule(
"lipo",
@@ -2698,41 +2698,44 @@ def GenerateOutputForConfig(target_list, target_dicts, data, params, config_name
master_ninja.rule(
"copy_infoplist",
description="COPY INFOPLIST $in",
command="$env ./gyp-mac-tool copy-info-plist $in $out $binary $keys",
command="$env %s gyp-mac-tool copy-info-plist $in $out $binary $keys"
% sys.executable,
)
master_ninja.rule(
"merge_infoplist",
description="MERGE INFOPLISTS $in",
command="$env ./gyp-mac-tool merge-info-plist $out $in",
command="$env %s gyp-mac-tool merge-info-plist $out $in" % sys.executable,
)
master_ninja.rule(
"compile_xcassets",
description="COMPILE XCASSETS $in",
command="$env ./gyp-mac-tool compile-xcassets $keys $in",
command="$env %s gyp-mac-tool compile-xcassets $keys $in" % sys.executable,
)
master_ninja.rule(
"compile_ios_framework_headers",
description="COMPILE HEADER MAPS AND COPY FRAMEWORK HEADERS $in",
command="$env ./gyp-mac-tool compile-ios-framework-header-map $out "
"$framework $in && $env ./gyp-mac-tool "
"copy-ios-framework-headers $framework $copy_headers",
command="$env %(python)s gyp-mac-tool compile-ios-framework-header-map "
"$out $framework $in && $env %(python)s gyp-mac-tool "
"copy-ios-framework-headers $framework $copy_headers"
% {'python': sys.executable},
)
master_ninja.rule(
"mac_tool",
description="MACTOOL $mactool_cmd $in",
command="$env ./gyp-mac-tool $mactool_cmd $in $out $binary",
command="$env %s gyp-mac-tool $mactool_cmd $in $out $binary"
% sys.executable,
)
master_ninja.rule(
"package_framework",
description="PACKAGE FRAMEWORK $out, POSTBUILDS",
command="./gyp-mac-tool package-framework $out $version$postbuilds "
"&& touch $out",
command="%s gyp-mac-tool package-framework $out $version$postbuilds "
"&& touch $out" % sys.executable,
)
master_ninja.rule(
"package_ios_framework",
description="PACKAGE IOS FRAMEWORK $out, POSTBUILDS",
command="./gyp-mac-tool package-ios-framework $out $postbuilds "
"&& touch $out",
command="%s gyp-mac-tool package-ios-framework $out $postbuilds "
"&& touch $out" % sys.executable,
)
if flavor == "win":
master_ninja.rule(

pylib/gyp/generator/xcode.py

@@ -959,7 +959,7 @@ def GenerateOutput(target_list, target_dicts, data, params):
# would-be additional inputs are newer than the output. Modifying
# the source tree - even just modification times - feels dirty.
# 6564240 Xcode "custom script" build rules always dump all environment
# variables. This is a low-prioroty problem and is not a
# variables. This is a low-priority problem and is not a
# show-stopper.
rules_by_ext = {}
for rule in spec_rules:

pylib/gyp/input.py

@@ -242,7 +242,7 @@ def LoadOneBuildFile(build_file_path, data, aux_data, includes, is_target, check
gyp.common.ExceptionAppend(e, "while reading " + build_file_path)
raise
if type(build_file_data) is not dict:
if not isinstance(build_file_data, dict):
raise GypError("%s does not evaluate to a dictionary." % build_file_path)
data[build_file_path] = build_file_data
@@ -303,20 +303,20 @@ def LoadBuildFileIncludesIntoDict(
# Recurse into subdictionaries.
for k, v in subdict.items():
if type(v) is dict:
if isinstance(v, dict):
LoadBuildFileIncludesIntoDict(v, subdict_path, data, aux_data, None, check)
elif type(v) is list:
elif isinstance(v, list):
LoadBuildFileIncludesIntoList(v, subdict_path, data, aux_data, check)
# This recurses into lists so that it can look for dicts.
def LoadBuildFileIncludesIntoList(sublist, sublist_path, data, aux_data, check):
for item in sublist:
if type(item) is dict:
if isinstance(item, dict):
LoadBuildFileIncludesIntoDict(
item, sublist_path, data, aux_data, None, check
)
elif type(item) is list:
elif isinstance(item, list):
LoadBuildFileIncludesIntoList(item, sublist_path, data, aux_data, check)
@@ -350,9 +350,9 @@ def ProcessToolsetsInDict(data):
data["targets"] = new_target_list
if "conditions" in data:
for condition in data["conditions"]:
if type(condition) is list:
if isinstance(condition, list):
for condition_dict in condition[1:]:
if type(condition_dict) is dict:
if isinstance(condition_dict, dict):
ProcessToolsetsInDict(condition_dict)
@@ -694,7 +694,7 @@ def IsStrCanonicalInt(string):
The canonical form is such that str(int(string)) == string.
"""
if type(string) is str:
if isinstance(string, str):
# This function is called a lot so for maximum performance, avoid
# involving regexps which would otherwise make the code much
# shorter. Regexps would need twice the time of this function.
@@ -744,7 +744,7 @@ cached_command_results = {}
def FixupPlatformCommand(cmd):
if sys.platform == "win32":
if type(cmd) is list:
if isinstance(cmd, list):
cmd = [re.sub("^cat ", "type ", cmd[0])] + cmd[1:]
else:
cmd = re.sub("^cat ", "type ", cmd)
@@ -870,7 +870,8 @@ def ExpandVariables(input, phase, variables, build_file):
# This works around actions/rules which have more inputs than will
# fit on the command line.
if file_list:
contents_list = contents if type(contents) is list else contents.split(" ")
contents_list = (contents if isinstance(contents, list)
else contents.split(" "))
replacement = contents_list[0]
if os.path.isabs(replacement):
raise GypError('| cannot handle absolute paths, got "%s"' % replacement)
@@ -1011,7 +1012,7 @@ def ExpandVariables(input, phase, variables, build_file):
if isinstance(replacement, bytes) and not isinstance(replacement, str):
replacement = replacement.decode("utf-8") # done on Python 3 only
if type(replacement) is list:
if isinstance(replacement, list):
for item in replacement:
if isinstance(item, bytes) and not isinstance(item, str):
item = item.decode("utf-8") # done on Python 3 only
@@ -1042,7 +1043,7 @@ def ExpandVariables(input, phase, variables, build_file):
# Expanding in list context. It's guaranteed that there's only one
# replacement to do in |input_str| and that it's this replacement. See
# above.
if type(replacement) is list:
if isinstance(replacement, list):
# If it's already a list, make a copy.
output = replacement[:]
else:
@@ -1051,7 +1052,7 @@ def ExpandVariables(input, phase, variables, build_file):
else:
# Expanding in string context.
encoded_replacement = ""
if type(replacement) is list:
if isinstance(replacement, list):
# When expanding a list into string context, turn the list items
# into a string in a way that will work with a subprocess call.
#
@@ -1081,8 +1082,8 @@ def ExpandVariables(input, phase, variables, build_file):
# expanding local variables (variables defined in the same
# variables block as this one).
gyp.DebugOutput(gyp.DEBUG_VARIABLES, "Found output %r, recursing.", output)
if type(output) is list:
if output and type(output[0]) is list:
if isinstance(output, list):
if output and isinstance(output[0], list):
# Leave output alone if it's a list of lists.
# We don't want such lists to be stringified.
pass
@@ -1097,7 +1098,7 @@ def ExpandVariables(input, phase, variables, build_file):
output = ExpandVariables(output, phase, variables, build_file)
# Convert all strings that are canonically-represented integers into integers.
if type(output) is list:
if isinstance(output, list):
for index, outstr in enumerate(output):
if IsStrCanonicalInt(outstr):
output[index] = int(outstr)
@@ -1115,7 +1116,7 @@ cached_conditions_asts = {}
def EvalCondition(condition, conditions_key, phase, variables, build_file):
"""Returns the dict that should be used or None if the result was
that nothing should be used."""
if type(condition) is not list:
if not isinstance(condition, list):
raise GypError(conditions_key + " must be a list")
if len(condition) < 2:
# It's possible that condition[0] won't work in which case this
@@ -1133,12 +1134,12 @@ def EvalCondition(condition, conditions_key, phase, variables, build_file):
while i < len(condition):
cond_expr = condition[i]
true_dict = condition[i + 1]
if type(true_dict) is not dict:
if not isinstance(true_dict, dict):
raise GypError(
f"{conditions_key} {cond_expr} must be followed by a dictionary, "
f"not {type(true_dict)}"
)
if len(condition) > i + 2 and type(condition[i + 2]) is dict:
if len(condition) > i + 2 and isinstance(condition[i + 2], dict):
false_dict = condition[i + 2]
i = i + 3
if i != len(condition):
@@ -1239,7 +1240,7 @@ def ProcessConditionsInDict(the_dict, phase, variables, build_file):
)
if merge_dict is not None:
# Expand variables and nested conditinals in the merge_dict before
# Expand variables and nested conditionals in the merge_dict before
# merging it.
ProcessVariablesAndConditionsInDict(
merge_dict, phase, variables, build_file
@@ -1320,7 +1321,7 @@ def ProcessVariablesAndConditionsInDict(
for key, value in the_dict.items():
# Skip "variables", which was already processed if present.
if key != "variables" and type(value) is str:
if key != "variables" and isinstance(value, str):
expanded = ExpandVariables(value, phase, variables, build_file)
if type(expanded) not in (str, int):
raise ValueError(
@@ -1383,21 +1384,21 @@ def ProcessVariablesAndConditionsInDict(
for key, value in the_dict.items():
# Skip "variables" and string values, which were already processed if
# present.
if key == "variables" or type(value) is str:
if key == "variables" or isinstance(value, str):
continue
if type(value) is dict:
if isinstance(value, dict):
# Pass a copy of the variables dict so that subdicts can't influence
# parents.
ProcessVariablesAndConditionsInDict(
value, phase, variables, build_file, key
)
elif type(value) is list:
elif isinstance(value, list):
# The list itself can't influence the variables dict, and
# ProcessVariablesAndConditionsInList will make copies of the variables
# dict if it needs to pass it to something that can influence it. No
# copy is necessary here.
ProcessVariablesAndConditionsInList(value, phase, variables, build_file)
elif type(value) is not int:
elif not isinstance(value, int):
raise TypeError("Unknown type " + value.__class__.__name__ + " for " + key)
@@ -1406,17 +1407,17 @@ def ProcessVariablesAndConditionsInList(the_list, phase, variables, build_file):
index = 0
while index < len(the_list):
item = the_list[index]
if type(item) is dict:
if isinstance(item, dict):
# Make a copy of the variables dict so that it won't influence anything
# outside of its own scope.
ProcessVariablesAndConditionsInDict(item, phase, variables, build_file)
elif type(item) is list:
elif isinstance(item, list):
ProcessVariablesAndConditionsInList(item, phase, variables, build_file)
elif type(item) is str:
elif isinstance(item, str):
expanded = ExpandVariables(item, phase, variables, build_file)
if type(expanded) in (str, int):
the_list[index] = expanded
elif type(expanded) is list:
elif isinstance(expanded, list):
the_list[index : index + 1] = expanded
index += len(expanded)
@@ -1431,7 +1432,7 @@ def ProcessVariablesAndConditionsInList(the_list, phase, variables, build_file):
+ " at "
+ index
)
elif type(item) is not int:
elif not isinstance(item, int):
raise TypeError(
"Unknown type " + item.__class__.__name__ + " at index " + index
)
@@ -2232,18 +2233,18 @@ def MergeLists(to, fro, to_file, fro_file, is_paths=False, append=True):
# The cheap and easy case.
to_item = MakePathRelative(to_file, fro_file, item) if is_paths else item
if not (type(item) is str and item.startswith("-")):
if not (isinstance(item, str) and item.startswith("-")):
# Any string that doesn't begin with a "-" is a singleton - it can
# only appear once in a list, to be enforced by the list merge append
# or prepend.
singleton = True
elif type(item) is dict:
elif isinstance(item, dict):
# Make a copy of the dictionary, continuing to look for paths to fix.
# The other intelligent aspects of merge processing won't apply because
# item is being merged into an empty dict.
to_item = {}
MergeDicts(to_item, item, to_file, fro_file)
elif type(item) is list:
elif isinstance(item, list):
# Recurse, making a copy of the list. If the list contains any
# descendant dicts, path fixing will occur. Note that here, custom
# values for is_paths and append are dropped; those are only to be
@@ -2312,12 +2313,12 @@ def MergeDicts(to, fro, to_file, fro_file):
to[k] = MakePathRelative(to_file, fro_file, v)
else:
to[k] = v
elif type(v) is dict:
elif isinstance(v, dict):
# Recurse, guaranteeing copies will be made of objects that require it.
if k not in to:
to[k] = {}
MergeDicts(to[k], v, to_file, fro_file)
elif type(v) is list:
elif isinstance(v, list):
# Lists in dicts can be merged with different policies, depending on
# how the key in the "from" dict (k, the from-key) is written.
#
@@ -2361,7 +2362,7 @@ def MergeDicts(to, fro, to_file, fro_file):
# If the key ends in "?", the list will only be merged if it doesn't
# already exist.
continue
elif type(to[list_base]) is not list:
elif not isinstance(to[list_base], list):
# This may not have been checked above if merging in a list with an
# extension character.
raise TypeError(
@@ -2542,7 +2543,7 @@ def ProcessListFiltersInDict(name, the_dict):
if operation not in {"!", "/"}:
continue
if type(value) is not list:
if not isinstance(value, list):
raise ValueError(
name + " key " + key + " must be list, not " + value.__class__.__name__
)
@@ -2555,7 +2556,7 @@ def ProcessListFiltersInDict(name, the_dict):
del_lists.append(key)
continue
if type(the_dict[list_key]) is not list:
if not isinstance(the_dict[list_key], list):
value = the_dict[list_key]
raise ValueError(
name
@@ -2668,17 +2669,17 @@ def ProcessListFiltersInDict(name, the_dict):
# Now recurse into subdicts and lists that may contain dicts.
for key, value in the_dict.items():
if type(value) is dict:
if isinstance(value, dict):
ProcessListFiltersInDict(key, value)
elif type(value) is list:
elif isinstance(value, list):
ProcessListFiltersInList(key, value)
def ProcessListFiltersInList(name, the_list):
for item in the_list:
if type(item) is dict:
if isinstance(item, dict):
ProcessListFiltersInDict(name, item)
elif type(item) is list:
elif isinstance(item, list):
ProcessListFiltersInList(name, item)
@@ -2788,7 +2789,7 @@ def ValidateRunAsInTarget(target, target_dict, build_file):
run_as = target_dict.get("run_as")
if not run_as:
return
if type(run_as) is not dict:
if not isinstance(run_as, dict):
raise GypError(
"The 'run_as' in target %s from file %s should be a "
"dictionary." % (target_name, build_file)
@@ -2799,19 +2800,19 @@ def ValidateRunAsInTarget(target, target_dict, build_file):
"The 'run_as' in target %s from file %s must have an "
"'action' section." % (target_name, build_file)
)
if type(action) is not list:
if not isinstance(action, list):
raise GypError(
"The 'action' for 'run_as' in target %s from file %s "
"must be a list." % (target_name, build_file)
)
working_directory = run_as.get("working_directory")
if working_directory and type(working_directory) is not str:
if working_directory and not isinstance(working_directory, str):
raise GypError(
"The 'working_directory' for 'run_as' in target %s "
"in file %s should be a string." % (target_name, build_file)
)
environment = run_as.get("environment")
if environment and type(environment) is not dict:
if environment and not isinstance(environment, dict):
raise GypError(
"The 'environment' for 'run_as' in target %s "
"in file %s should be a dictionary." % (target_name, build_file)
@@ -2843,15 +2844,15 @@ def TurnIntIntoStrInDict(the_dict):
# Use items instead of iteritems because there's no need to try to look at
# reinserted keys and their associated values.
for k, v in the_dict.items():
if type(v) is int:
if isinstance(v, int):
v = str(v)
the_dict[k] = v
elif type(v) is dict:
elif isinstance(v, dict):
TurnIntIntoStrInDict(v)
elif type(v) is list:
elif isinstance(v, list):
TurnIntIntoStrInList(v)
if type(k) is int:
if isinstance(k, int):
del the_dict[k]
the_dict[str(k)] = v
@@ -2860,11 +2861,11 @@ def TurnIntIntoStrInList(the_list):
"""Given list the_list, recursively converts all integers into strings.
"""
for index, item in enumerate(the_list):
if type(item) is int:
if isinstance(item, int):
the_list[index] = str(item)
elif type(item) is dict:
elif isinstance(item, dict):
TurnIntIntoStrInDict(item)
elif type(item) is list:
elif isinstance(item, list):
TurnIntIntoStrInList(item)
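One behavioral nuance of the `isinstance()` sweep in input.py is worth flagging: `isinstance()` matches subclasses where `type(x) is T` does not. The classic case is `bool`, which is a subclass of `int` (illustrative, not from this commit):

```
flag = True  # bool is a subclass of int

print(type(flag) is int)      # False: the exact-type check rejects bool
print(isinstance(flag, int))  # True: subclass instances now match

# In TurnIntIntoStrInDict, for example, a bool value is now stringified to
# "True"/"False", where the old exact-type check left it untouched.
```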

pylib/gyp/xcode_emulation.py

@@ -1174,8 +1174,9 @@ class XcodeSettings:
# Then re-sign everything with 'preserve=True'
postbuilds.extend(
[
'%s code-sign-bundle "%s" "%s" "%s" "%s" %s'
'%s %s code-sign-bundle "%s" "%s" "%s" "%s" %s'
% (
sys.executable,
os.path.join("${TARGET_BUILD_DIR}", "gyp-mac-tool"),
key,
settings.get("CODE_SIGN_ENTITLEMENTS", ""),
@@ -1190,8 +1191,9 @@
for target in targets:
postbuilds.extend(
[
'%s code-sign-bundle "%s" "%s" "%s" "%s" %s'
'%s %s code-sign-bundle "%s" "%s" "%s" "%s" %s'
% (
sys.executable,
os.path.join("${TARGET_BUILD_DIR}", "gyp-mac-tool"),
key,
settings.get("CODE_SIGN_ENTITLEMENTS", ""),
@@ -1204,8 +1206,9 @@
postbuilds.extend(
[
'%s code-sign-bundle "%s" "%s" "%s" "%s" %s'
'%s %s code-sign-bundle "%s" "%s" "%s" "%s" %s'
% (
sys.executable,
os.path.join("${TARGET_BUILD_DIR}", "gyp-mac-tool"),
key,
settings.get("CODE_SIGN_ENTITLEMENTS", ""),
@@ -1858,7 +1861,7 @@ def _TopologicallySortedEnvVarKeys(env):
regex = re.compile(r"\$\{([a-zA-Z0-9\-_]+)\}")
def GetEdges(node):
# Use a definition of edges such that user_of_variable -> used_varible.
# Use a definition of edges such that user_of_variable -> used_variable.
# This happens to be easier in this case, since a variable's
# definition contains all variables it references in a single string.
# We can then reverse the result of the topological sort at the end.

pylib/gyp/xcodeproj_file.py

@@ -74,7 +74,7 @@ layer of indirection between an XCBuildPhase and a PBXFileReference via a
PBXBuildFile appears extraneous, but there's actually one reason for this:
file-specific compiler flags are added to the PBXBuildFile object so as to
allow a single file to be a member of multiple targets while having distinct
compiler flags for each. These flags can be modified in the Xcode applciation
compiler flags for each. These flags can be modified in the Xcode application
in the "Build" tab of a File Info window.
When a project is open in the Xcode application, Xcode will rewrite it. As
@@ -662,7 +662,7 @@ class XCObject:
tabs is an int identifying the indentation level. If the class'
_should_print_single_line variable is True, tabs is ignored and the
key-value pair will be followed by a space insead of a newline.
key-value pair will be followed by a space instead of a newline.
"""
if self._should_print_single_line:
@@ -781,7 +781,7 @@ class XCObject:
# Make sure the property conforms to the schema.
(is_list, property_type, is_strong) = self._schema[property][0:3]
if is_list:
if value.__class__ != list:
if not isinstance(value, list):
raise TypeError(
property
+ " of "
@@ -791,7 +791,7 @@ )
)
for item in value:
if not isinstance(item, property_type) and not (
isinstance(item, str) and property_type == str
isinstance(item, str) and isinstance(property_type, str)
):
# Accept unicode where str is specified. str is treated as
# UTF-8-encoded.
@@ -806,7 +806,7 @@
+ item.__class__.__name__
)
elif not isinstance(value, property_type) and not (
isinstance(value, str) and property_type == str
isinstance(value, str) and isinstance(property_type, str)
):
# Accept unicode where str is specified. str is treated as
# UTF-8-encoded.
@@ -2994,7 +2994,7 @@ class PBXProject(XCContainerPortal):
key=lambda x: x["ProjectRef"].Name().lower()
)
else:
# The link already exists. Pull out the relevnt data.
# The link already exists. Pull out the relevant data.
project_ref_dict = self._other_pbxprojects[other_pbxproject]
product_group = project_ref_dict["ProductGroup"]
project_ref = project_ref_dict["ProjectRef"]

pylib/packaging/metadata.py

@@ -145,7 +145,7 @@ class RawMetadata(TypedDict, total=False):
# Metadata 2.3 - PEP 685
# No new fields were added in PEP 685, just some edge case were
# tightened up to provide better interoptability.
# tightened up to provide better interoperability.
_STRING_FIELDS = {
@@ -206,10 +206,10 @@ def _parse_project_urls(data: List[str]) -> Dict[str, str]:
# be the missing value, then they'd have multiple '' values that
# overwrite each other in a accumulating dict.
#
# The other potentional issue is that it's possible to have the
# The other potential issue is that it's possible to have the
# same label multiple times in the metadata, with no solid "right"
# answer with what to do in that case. As such, we'll do the only
# thing we can, which is treat the field as unparseable and add it
# thing we can, which is treat the field as unparsable and add it
# to our list of unparsed fields.
parts = [p.strip() for p in pair.split(",", 1)]
parts.extend([""] * (max(0, 2 - len(parts)))) # Ensure 2 items
@@ -222,8 +222,8 @@ def _parse_project_urls(data: List[str]) -> Dict[str, str]:
label, url = parts
if label in urls:
# The label already exists in our set of urls, so this field
# is unparseable, and we can just add the whole thing to our
# unparseable data and stop processing it.
# is unparsable, and we can just add the whole thing to our
# unparsable data and stop processing it.
raise KeyError("duplicate labels in project urls")
urls[label] = url
@@ -433,7 +433,7 @@ def parse_email(data: Union[bytes, str]) -> Tuple[RawMetadata, Dict[str, List[st
except KeyError:
unparsed[name] = value
# Nothing that we've done has managed to parse this, so it'll just
# throw it in our unparseable data and move on.
# throw it in our unparsable data and move on.
else:
unparsed[name] = value
@@ -450,7 +450,7 @@ def parse_email(data: Union[bytes, str]) -> Tuple[RawMetadata, Dict[str, List[st
else:
if payload:
# Check to see if we've already got a description, if so then both
# it, and this body move to unparseable.
# it, and this body move to unparsable.
if "description" in raw:
description_header = cast(str, raw.pop("description"))
unparsed.setdefault("description", []).extend(

pyproject.toml

@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"
[project]
name = "gyp-next"
version = "0.18.1"
version = "0.18.2"
authors = [
{ name="Node.js contributors", email="ryzokuken@disroot.org" },
]
@@ -92,7 +92,6 @@ select = [
# "TRY", # tryceratops
]
ignore = [
"E721",
"PLC1901",
"PLR0402",
"PLR1714",

tools/pretty_vcproj.py

@@ -207,7 +207,7 @@ def CleanupVcproj(node):
node.appendChild(new_node)
def GetConfiguationNodes(vcproj):
def GetConfigurationNodes(vcproj):
# TODO(nsylvain): Find a better way to navigate the xml.
nodes = []
for node in vcproj.childNodes:
@@ -307,7 +307,7 @@ def main(argv):
# First thing we need to do is find the Configuration Node and merge them
# with the vsprops they include.
for configuration_node in GetConfiguationNodes(dom.documentElement):
for configuration_node in GetConfigurationNodes(dom.documentElement):
# Get the property sheets associated with this configuration.
vsprops = configuration_node.getAttribute("InheritedPropertySheets")